Machine Vision Sees Into Chickens’ Futures

Software scans video for early indications of poultry problems

Photo: Peter Chadwick/Getty Images

A jump to the left and a step to the right are signs of healthy activity, as chicken farmers who stroll among their flocks already know. Now a team led by robotics engineer Stephen Roberts at the University of Oxford has found that patterns in the collective motion of a flock of chickens can help farmers predict disease weeks before onset. Call it a chicken time warp.

Roberts and animal-welfare researchers at Oxford first tested their pattern-detection system by asking it to warn farmers before a flock got “peckish.” That’s not a euphemism for “hungry.” Well-fed hens, it turns out, sometimes take out their worm-hunting instincts on one another. The system, which paired cameras recording a flock with computer analysis of the footage, beat human experts at flagging at-risk flocks before the madness took its toll [“Computer System Counters Hen Horrors,” September 2010].


Are You Ready for Workplace Brain Scanning?

Extracting and using brain data will make workers happier and more productive, backers say

Nadia Radic

Get ready: Neurotechnology is coming to the workplace. Neural sensors are now reliable and affordable enough to support commercial pilot projects that extract productivity-enhancing data from workers’ brains. These projects aren’t confined to specialized workplaces; they’re also happening in offices, factories, farms, and airports. The companies and people behind these neurotech devices are certain that they will improve our lives. But there are serious questions about whether work should be organized around certain functions of the brain, rather than the person as a whole.

To be clear, the kind of neurotech that’s currently available is nowhere close to reading minds. Sensors detect electrical activity across different areas of the brain, and the patterns in that activity can be broadly correlated with feelings or physiological responses such as stress, focus, or a reaction to external stimuli. These data can be exploited to make workers more efficient—and, proponents of the technology say, to make them happier. Two of the most interesting innovators in this field are the Israel-based startup InnerEye, which aims to give workers superhuman abilities, and Emotiv, a Silicon Valley neurotech company that’s bringing a brain-tracking wearable to office workers, including those working remotely.
