This paper proposes a novel multi-output Gaussian process (MOGP) model and a multi-task deep learning (MTDL) algorithm to simultaneously predict wrist rotation and finger gestures for transradial amputees using a wearable ultrasound array. The results demonstrate that MOGP outperformed previous subclass discriminant analysis methods on both finger gesture and wrist rotation prediction. MTDL further improved finger gesture prediction accuracy, though at the cost of some wrist rotation accuracy. An extended comparison between ultrasound and surface electromyography demonstrated the advantage of ultrasound for prosthetic control. Finally, the authors have made their UltraPro dataset available to the research community to facilitate further advances in prosthetic control. This article was authored by Shingchen Yang, Yifan Lu, Zongchen Yin, and others.