Embedded Engineer AI Challenge Camp, RV1106 Face Recognition + Running Log (6)

Continuing from the previous entry.

File path:

Luckfox Pico\示例程序\RKNN示例程序\luckfox_rknn.zip\luckfox_rknn\scripts\luckfox_onnx_to_rknn\sim\retinaface\

Make a copy of the retinaface.py file.

I noticed that this file is identical to the convert script up to the rknn.build call; the difference is what happens after build: one exports the converted model, the other calls rknn.inference to run inference.

After changing target_platform and dynamic_input in rknn.config, I ran it straight away.
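For context, a rough sketch of how the two sibling scripts relate. The parameter values below are illustrative, not the Luckfox script's exact settings; the calls are the standard RKNN-Toolkit2 flow of config / load_onnx / build, followed by either export_rknn or init_runtime + inference:

    # sketch of the convert vs. sim flow (illustrative values, not Luckfox's exact ones)
    from rknn.api import RKNN
    import cv2

    rknn = RKNN(verbose=True)
    rknn.config(mean_values=[[0, 0, 0]], std_values=[[255, 255, 255]],   # illustrative
                target_platform='rv1106',                                # changed for the RV1106
                dynamic_input=[[[1, 3, 640, 640]]])                      # shape illustrative
    rknn.load_onnx(model='./retinaface.onnx')
    rknn.build(do_quantization=True, dataset='./dataset.txt')   # int8 quantization, hence the warnings

    # convert.py stops here and exports the model:
    # rknn.export_rknn('./retinaface.rknn')

    # the sim retinaface.py instead runs inference on the PC simulator:
    rknn.init_runtime()
    img = cv2.imread('./test.jpg')
    infer_img = cv2.resize(img, (640, 640))
    outputs = rknn.inference(inputs=[infer_img])
    rknn.release()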

Hey, it actually ran inference, just with a pile of warnings about float being converted to int8, and it finally stopped at:

bboxes, kpss = outputs # get the output data

The error was that outputs had too many values to unpack. To see what was coming out I printed outputs, and its structure seemed different from the original Luckfox one. I lost a few days on this detour, then put a * on the left-hand side to soak up the extra return values.
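A quick illustration of that star trick (plain Python extended unpacking):

    # extended unpacking: the starred name soaks up however many extra outputs there are
    first, *rest = outputs    # rest is a list holding everything after the first tensor
    *rest, last = outputs     # or collect everything before the last one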

But the calculations and conversions further down still threw errors. Headache.

In the end I came back to the outputs and, treating it like the old "how do you put an elephant in the fridge" joke (break it into steps), went digging through insightface for inspiration.

Following the README under insightface\python-package:

insightface already wraps all of this up:

app = FaceAnalysis(providers=['CUDAExecutionProvider', 'CPUExecutionProvider'])
Initializes the model; under the hood this goes through face_analysis.py in insightface\python-package\insightface\app.

app.prepare(ctx_id=0, det_size=(640, 640))
Sets the input size for the detection model.

faces = app.get(img)
Feed in an image and it returns the detected face data.
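Putting those three calls together, a minimal end-to-end sketch along the lines of the README (file names here are illustrative):

    import cv2
    from insightface.app import FaceAnalysis

    app = FaceAnalysis(providers=['CUDAExecutionProvider', 'CPUExecutionProvider'])
    app.prepare(ctx_id=0, det_size=(640, 640))

    img = cv2.imread('./test.jpg')      # illustrative input image
    faces = app.get(img)                # list of Face objects (bbox, kps, det_score, ...)
    rimg = app.draw_on(img, faces)      # draw boxes and landmarks
    cv2.imwrite('./output.jpg', rimg)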

app.get is the key call. Its flow is roughly: face_analysis under app -> retinaface under model_zoo.


retinaface.py is the core of FaceAnalysis's face detection:
app.get calls retinaface's detect,
and detect -> forward, where self.session.run performs the actual inference.

Good, that's the inference part located. So the question is: can insightface's parsing of its inference results be reused to parse the rknn inference results?
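A condensed view of that chain (paraphrased from the insightface sources, not verbatim):

    # FaceAnalysis.get(img)                        # app/face_analysis.py
    #   -> self.det_model.detect(img, ...)         # model_zoo/retinaface.py
    #        -> forward(): build blob -> self.session.run(...)
    #           returns 9 raw maps: scores / bbox / kps for strides 8, 16, 32
    #        -> decode against anchor centers, threshold, NMS, divide by det_scale
    #   -> wrap each detection in a Face object and run the remaining models on it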

Let's try it. I moved the code that follows self.session.run in insightface over to the rknn script, which gave the code below (not cleaned up):
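    # NOTE: this fragment lives inside the modified rknn sim script and assumes the
    # following are already in scope: numpy as np, cv2, an RKNN object `rknn` that has
    # been built and is ready for inference, the preprocessed 640x640 input `infer_img`,
    # Face from insightface.app.common, and the distance2bbox / distance2kps / nms /
    # draw_on helpers copied over from insightface.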

    outputs = rknn.inference(inputs=[infer_img])#, data_format=['nhwc'])
    #print (outputs)
    scores_list = []
    bboxes_list = []
    kpss_list = [] 
    print("forward-------------------") 
    input_height = 640
    input_width = 640
    fmc = 3
    threshold=0.5
    feat_stride_fpn=[8, 16, 32]
    num_anchors = 2
    center_cache = {}  # anchor-center cache keyed by (height, width, stride); this is self.center_cache in insightface, so keep it outside the stride loop
    for idx, stride in enumerate(feat_stride_fpn):
        print(feat_stride_fpn,idx,stride)
        scores = outputs[idx]
        bbox_preds = outputs[idx+fmc]
        bbox_preds = bbox_preds * stride
        kps_preds = outputs[idx+fmc*2] * stride
        height = input_height // stride
        width = input_width // stride
        K = height * width
        key = (height, width, stride)
        print(key,stride)
        if key in center_cache:
            anchor_centers = center_cache[key]
        else:
            #solution-1, c style:
            #anchor_centers = np.zeros( (height, width, 2), dtype=np.float32 )
            #for i in range(height):
            #    anchor_centers[i, :, 1] = i
            #for i in range(width):
            #    anchor_centers[:, i, 0] = i

            #solution-2:
            #ax = np.arange(width, dtype=np.float32)
            #ay = np.arange(height, dtype=np.float32)
            #xv, yv = np.meshgrid(np.arange(width), np.arange(height))
            #anchor_centers = np.stack([xv, yv], axis=-1).astype(np.float32)

            #solution-3:
            anchor_centers = np.stack(np.mgrid[:height, :width][::-1], axis=-1).astype(np.float32)
            #print(anchor_centers.shape)

            anchor_centers = (anchor_centers * stride).reshape( (-1, 2) )
            if num_anchors>1:
                anchor_centers = np.stack([anchor_centers]*num_anchors, axis=1).reshape( (-1,2) )
            if len(center_cache)<100:
                center_cache[key] = anchor_centers
                
        pos_inds = np.where(scores>=threshold)[0]
        bboxes = distance2bbox(anchor_centers, bbox_preds)
        pos_scores = scores[pos_inds]
        pos_bboxes = bboxes[pos_inds]
        scores_list.append(pos_scores)
        bboxes_list.append(pos_bboxes)
       
        kpss = distance2kps(anchor_centers, kps_preds)
        #kpss = kps_preds
        kpss = kpss.reshape( (kpss.shape[0], -1, 2) )
        pos_kpss = kpss[pos_inds]
        kpss_list.append(pos_kpss)
    #self.forward
    print("bboxes_list---------------")
    print (bboxes_list)
    '''
    print("scores_list---------------")
    print (scores_list)
    print("bboxes_list---------------")
    print (bboxes_list)
    print("kpss_list-----------------") 
    print (kpss_list)
    print("-----------------------")
    '''  
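    # det_scale is hard-coded here; in insightface's detect() it is the ratio by which
    # the original image was resized to the 640x640 detection input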
    det_scale=0.5            
    scores = np.vstack(scores_list)
    scores_ravel = scores.ravel()
    order = scores_ravel.argsort()[::-1]
    bboxes = np.vstack(bboxes_list) / det_scale
   
    kpss = np.vstack(kpss_list) / det_scale
    pre_det = np.hstack((bboxes, scores)).astype(np.float32, copy=False)
    pre_det = pre_det[order, :]
    keep = nms(pre_det,0.4)
    det = pre_det[keep, :]
    
    kpss = kpss[order,:,:]
    kpss = kpss[keep,:,:]
    max_num = 0
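    metric = 'default'  # in insightface, max_num and metric are arguments of detect(); metric only matters when max_num > 0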
    print (max_num,det.shape[0])
    if max_num > 0 and det.shape[0] > max_num:
        area = (det[:, 2] - det[:, 0]) * (det[:, 3] - det[:, 1])
        img_center = img.shape[0] // 2, img.shape[1] // 2
        offsets = np.vstack([
                (det[:, 0] + det[:, 2]) / 2 - img_center[1],
                (det[:, 1] + det[:, 3]) / 2 - img_center[0]
            ])
        offset_dist_squared = np.sum(np.power(offsets, 2.0), 0)
        if metric == 'max':
            values = area
        else:
            values = area - offset_dist_squared * 2.0  # some extra weight on the centering
        bindex = np.argsort(values)[::-1]
        bindex = bindex[0:max_num]
        det = det[bindex, :]
        if kpss is not None:
           kpss = kpss[bindex, :]
    #self.det_model.detect
    
    bboxes = det  
    kpss  = kpss
    print("bboxes-------------------")   
    print(bboxes)
    print("kpss-------------------") 
    print(kpss)
    '''   
    if bboxes.shape[0] == 0:
        return []
    '''
    ret = []
    for i in range(bboxes.shape[0]):
        bbox = bboxes[i, 0:4]
        det_score = bboxes[i, 4]
        kps = None
        if kpss is not None:
            kps = kpss[i]
        face = Face(bbox=bbox, kps=kps, det_score=det_score)
        '''
        for taskname, model in self.models.items():
           if taskname=='detection':
               continue
           model.get(img, face)
        '''
        #model.get(img, face)
        ret.append(face)
    print("ret---------------")
    print (ret)  
    faces = ret
    img = cv2.imread('./test.jpg')
    rimg = draw_on(img, faces)
    cv2.imwrite("./ldh_output.jpg", rimg)

It runs and prints results, but why is everything shifted along the Y axis?

For comparison, I also ran the same image through the stock rknn inference; the results are shown in the images below.

 

 

Right now I'm still tracking down the coordinate-offset issue.

To be continued...