Limb amputations affect a substantial number of people worldwide every year, whether necessitated by underlying health conditions or by traumatic injury. Current prosthetic devices intended to alleviate the burden of amputation lack many of the key capabilities of their biological counterparts, foremost among them agility and tactile function. To address the former, this study investigates the fundamental connection between agile finger movement and brain signaling. Each subject was asked to move his or her right index finger in sync with a time-aligned finger movement demonstration; each movement was labeled, and the subject's brain activity was recorded with a single-channel electroencephalograph. These data were then used to train and test a deep neural network to classify each subject's intention to rest versus intention to extend the right index finger. On average, the model yielded an accuracy of 63.3%, and the most predictable subject's movements were classified with an accuracy of 70.5%.
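The abstract does not specify the network architecture, window length, or sampling rate, so the sketch below is purely illustrative: it shows one plausible way to set up a binary rest-vs-extend classifier over windows of single-channel EEG, using an assumed 256-sample window, an assumed small feedforward network in PyTorch, and placeholder data in place of the labeled recordings. It should not be read as the authors' actual model.

```python
# Hypothetical sketch only: architecture, window size, and sampling rate are
# assumptions; the actual study's deep neural network is not described here.
import torch
import torch.nn as nn

SAMPLES_PER_WINDOW = 256  # assumed ~1 s window at an assumed 256 Hz sampling rate


class RestVsExtendClassifier(nn.Module):
    """Maps one single-channel EEG window to two logits:
    intention to rest vs. intention to extend the right index finger."""

    def __init__(self, n_samples: int = SAMPLES_PER_WINDOW):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_samples, 128),
            nn.ReLU(),
            nn.Dropout(0.5),
            nn.Linear(128, 32),
            nn.ReLU(),
            nn.Linear(32, 2),  # two classes: 0 = rest, 1 = extend
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)


# Minimal training step on synthetic data standing in for labeled EEG windows.
model = RestVsExtendClassifier()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

windows = torch.randn(64, SAMPLES_PER_WINDOW)  # placeholder batch of EEG windows
labels = torch.randint(0, 2, (64,))            # placeholder labels: 0 = rest, 1 = extend

logits = model(windows)
loss = criterion(logits, labels)
loss.backward()
optimizer.step()
```

In practice, per-subject accuracy figures such as those reported (63.3% average, 70.5% best) would come from evaluating a model like this on held-out labeled windows for each subject.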