Vision is a potent source of information, not only for humans but for robots as well. Processing visual information is computationally expensive and often difficult to accomplish in real time on embedded hardware. Within the broad field of visual research lies egomotion estimation: the process of determining self-motion from optical flow. Here we present a technological adaptation and implementation of a complex, biologically-inspired egomotion estimator that had previously only been simulated. Through efficient biologically-inspired signal conditioning and motion estimation techniques, rotational velocity estimates are achieved at 100 frames per second on embedded hardware, within an operating range of 2.7°/s to 72.1°/s and with accuracy comparable to traditional sensors. These findings open up possibilities for egomotion estimation on embedded platforms, which could complement existing rotational velocity estimators under complex environmental conditions.
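The core idea, recovering rotational velocity from optical flow, can be illustrated with a minimal sketch. The function name, focal length, and the small-angle pure-rotation model below are illustrative assumptions for a narrow field of view, not the biologically-inspired estimator described in this work:

```python
import numpy as np

def estimate_yaw_rate(flow_x, focal_px, fps):
    """Estimate yaw rate (rad/s) from horizontal optical flow.

    Hypothetical model: for pure rotation about the vertical axis with
    a narrow field of view, horizontal image flow per frame is roughly
    -focal_px * omega (small-angle approximation), so
    omega ~ -mean(flow_x) / focal_px * fps.
    """
    omega_per_frame = -np.mean(flow_x) / focal_px
    return omega_per_frame * fps

# Synthetic check: a 5 deg/s yaw at 100 fps with a 400 px focal length
fps, focal = 100.0, 400.0
omega_true = np.deg2rad(5.0)
flow_x = np.full((120, 160), -focal * omega_true / fps)  # ideal flow field
omega_est = estimate_yaw_rate(flow_x, focal, fps)
print(np.rad2deg(omega_est))  # recovers roughly 5.0 deg/s
```

A real implementation would additionally have to cope with translation-induced flow, noise, and lens distortion, which is where the signal conditioning discussed above comes in.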