Meet the IEEE Members Behind Five of Today’s Hottest Startups

This article is part of our September 2015 special report on startups, which highlights IEEE’s efforts to attract more entrepreneurial types to the organization.

Photos center: Limor Fried. Clockwise from top left: Jessica Colaço, Devon Ryan, Milton Chang, Mou Riiny, Earl Bakken, Tan Le, Melonee Wise, and João Barros.

IEEE members are at the forefront of many emerging areas of new technology, including robotics, the Internet of Things, and devices for neurological research. Some members have already turned their ideas into successful companies.


Tan Le | Company: Emotiv | Location: San Francisco | Year founded: 2003

If you attended the Consumer Electronics Show in January in Las Vegas and stopped at the IEEE booth, you may have noticed people moving race cars just by thinking about it. Wearing electroencephalography (EEG) headsets like the one above and concentrating on the task, two people at a time would propel toy race cars down parallel tracks, drag race–style.

The brain behind the headsets is IEEE Member Tan Le. She is cofounder of Emotiv, the company that developed Epoc, the EEG headset. It has five sensors that detect brain waves and transmit data wirelessly to a PC. The headset can be used not just to move objects but also to track the wearer’s brain activity. It can measure levels of attention, focus, excitement, stress, and relaxation. Over time, patterns in the data could help doctors detect the onset of dementia and similar disorders. The headset has been sold for US $399 in more than 90 countries.
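How such mental-state metrics are derived is not spelled out here, but a common heuristic in EEG research is to compare signal power in different frequency bands. The sketch below is a generic illustration of that idea, not Emotiv's algorithm; the 128 Hz sampling rate and the band edges are assumptions chosen for the example.

```python
import math

def band_power(signal, fs, low, high):
    """Mean spectral power of `signal` (a list of samples) within the
    [low, high] Hz band, computed with a naive DFT -- fine for the short
    illustrative window used here."""
    n = len(signal)
    total, count = 0.0, 0
    for k in range(n // 2 + 1):
        freq = k * fs / n
        if low <= freq <= high:
            re = sum(signal[i] * math.cos(2 * math.pi * k * i / n) for i in range(n))
            im = sum(signal[i] * math.sin(2 * math.pi * k * i / n) for i in range(n))
            total += re * re + im * im
            count += 1
    return total / count

def attention_index(signal, fs=128):
    """Beta (13-30 Hz) to alpha (8-12 Hz) power ratio: a higher value is
    commonly read as more focused, a lower one as more relaxed."""
    return band_power(signal, fs, 13, 30) / band_power(signal, fs, 8, 12)

# Synthetic one-second "relaxed" recording: a strong 10 Hz alpha wave and
# only a weak 20 Hz beta component, so the index comes out well below 1.
fs = 128
t = [i / fs for i in range(fs)]
relaxed = [2.0 * math.sin(2 * math.pi * 10 * x) + 0.5 * math.sin(2 * math.pi * 20 * x)
           for x in t]
print(attention_index(relaxed, fs) < 1.0)
```

Real systems add per-user calibration, artifact rejection, and many more channels; this only shows why band power is a workable proxy for mental state.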

Le told National Geographic magazine she wants to keep the price down so the headset is accessible to researchers everywhere: “I want to leverage the creativity of researchers in mathematics, statistics, data mining, computer science, biology, and medicine, as well as the general public,” she says, adding, “The brain is the cornerstone of virtually every facet of our lives. I wish we knew more.”


Jessica Colaço | Company: iHub | Location: Nairobi, Kenya | Year founded: 2010

It takes much more than a good idea to create a successful startup. You need the right people: not just business partners but mentors and investors willing to take a risk on your product. And, of course, you need a physical space to start your business.

IEEE Member Jessica Colaço is well aware of entrepreneurs’ needs. She’s the director of partnerships at iHub, a workspace in Nairobi, Kenya, that technologists, investors, young entrepreneurs, designers, market researchers, and programmers can use to develop what they have in mind. The company provides budding entrepreneurs with Internet access and matches them with local businesses and mentors. It can also help connect them with venture capitalists.

Colaço courts investors to fund iHub members’ tech ideas and companies at the concept stage, before a product even materializes. She says iHub is well placed: “Nairobi is becoming a major technological hub in East Africa. Young Africans are getting trained there to acquire the skills they need to compete globally.”


Melonee Wise | Company: Fetch Robotics | Location: San Jose, Calif. | Year founded: 2015

Ordering items online and getting them delivered to your door the next day is a luxury that many of us enjoy. But those items often have been sitting in huge warehouses—some, like Amazon’s, take up more than 90,000 square meters. There, logistics workers, or “pickers,” have the exhausting job of running around the building to retrieve items that have been ordered. It can be difficult to keep up.

Enter Fetch Robotics, cofounded by IEEE Member Melonee Wise. Its pair of robots, Fetch and Freight, work in tandem to navigate warehouses and fulfill orders. Fetch, a one-armed, wheeled, autonomous robot, finds items on the shelf and loads them onto Freight, an autonomous delivery cart. With an extendable spine, Fetch can reach shelves up to 2 meters high. Freight moves the picked items to another part of the warehouse, where human workers pack them for shipment. The company has also developed Follow Pick, a mobile app for workers to track orders and locate or control any robot or even a fleet of robots, if the company has several.

Fetch Robotics launched in February and began shipping its products in June. It started with a dozen employees, but Wise told IEEE Spectrum she expects the staff to double by this month. “If there are developers and roboticists out there looking for ‘funemployment’ with robots, I welcome them to apply,” she says.


João Barros | Company: Veniam | Location: Porto, Portugal | Year founded: 2012

You’ve heard of the Internet of Things—a network of items, each embedded with sensors, that is connected to the Internet. But what about the Internet of Moving Things?

IEEE Senior Member João Barros is behind the idea, in which vehicles connected wirelessly to the Internet act as a network of moving sensors. His company, Veniam, developed NetRider, a device installed in a vehicle that transmits its location, traffic information, and other data about its surroundings to the cloud. It also offers passengers a Wi-Fi connection while they travel—in essence, a mobile hotspot.

In Porto, Portugal, Veniam has equipped more than 600 vehicles, including a fleet of some 400 passenger buses, with NetRider, providing Internet access to more than 90,000 commuters each month. The company is installing vehicular networks in Barcelona and Singapore and hopes to deploy NetRider in the United States this year, Barros says.

He urges entrepreneurs to use their time wisely. “Every single decision about how you spend your time can determine whether your venture succeeds or fails,” he says.


Devon Ryan | Company: Lion Mobile | Location: Austin, Texas | Year founded: 2013

You don’t need to be a wine connoisseur to enjoy a glass at the end of the day. That’s the idea behind unWine, a mobile app that educates users about wines and encourages them to post reviews of ones they’ve tasted. The app was developed by IEEE members Devon Ryan and Fabio Gomez. In 2013 they cofounded the software development company Lion Mobile.

Ryan says the app is geared toward young people who want to learn more about wine or simply have fun without being intimidated by experts or sophisticated lingo. Gomez and Ryan met in a programming class at the University of Texas at San Antonio, where they graduated with bachelor’s degrees in engineering in 2012 and 2013, respectively.

Ryan represents the IEEE Young Professionals group on the IEEE-USA Board of Directors. He says the struggles associated with being his own boss have brought out his creativity. “It’s such a huge challenge starting something from nothing that it will force you to reach deep inside and bring out your best ideas,” he tells The Institute. “Even if your company doesn’t turn a profit right away, you benefit from the challenge alone.”

This article originally appeared in print as “Ready to Launch.”


How DIY Electronics Startup Adafruit Industries Became a Multimillion-Dollar Company

IEEE Member Limor Fried started the venture in her dorm room at MIT

Limor Fried, founder of Adafruit Industries, in New York City, with a pick-and-place machine for assembling custom circuit boards.

This article is part of our September 2015 special report on startups, which highlights IEEE’s efforts to attract more entrepreneurial types to the organization.

Meet IEEE Member Limor Fried. Her do-it-yourself electronics company for hobbyists has carved out a category all its own and is worth millions. Through the new IEEE entrepreneur initiative, the organization hopes to attract more members like Fried who are developing engineer-inspired ventures.

Fried’s company, Adafruit Industries, makes hundreds of different kits, many built around an open-source prototyping platform, for gadgets such as smartphones, handheld video games, and GPS-enabled clothing that promises “you’ll never get lost again.” The most popular is the US $19 MintyBoost, a charger for an iPhone or iPad whose circuit and two AA batteries fit inside an Altoids mint tin.

The company sells more than 2,600 electronic parts, including wire and cable, conductive thread for wearables, multimeters, and breadboards and sockets. It also offers some 800 free tutorials. For the projects, you’ll need a soldering iron and a diagonal metal cutter, which are sold separately. Adafruit’s mission: to make electronics accessible and understandable to everyone. The company, named in honor of English mathematician Ada Lovelace, is “a wonderland of electronics,” Fried says.

As an electrical engineering grad student at MIT in 2005, Fried, now 35, often built her own gadgets, including the smartphone she uses today. Investing the $10,000 she had saved for tuition, and without ever taking a loan or venture capital funding, Fried has grown her business, which earned revenues of $33 million last year. It now has 83 employees at its 930-square-meter facility in New York City.


Adafruit Industries got its start after Fried posted her own DIY projects on her website. People contacted her about how they could build their own gadgets. “It was something for me to kill time with until I got a real job in industry,” she says.

Soon she found herself flooded with orders and making daily trips to the post office between classes to mail the kits. She finally felt she had no choice but to read up on the nuts and bolts of running a company, especially the financial parts. “The idea of crunching numbers didn’t scare me. I’m an engineer,” she recalls. “I thought, accounting is so easy compared to differential calculus.”

For Fried, it was about learning one aspect of the business at a time. When she hired her first employee, she learned to process a payroll. When it came time to file taxes, she determined which forms she needed.

Fried says she wouldn’t have been so successful, however, if she didn’t offer well-designed goods. “When you have poor-quality products, you need to deal with a lot of technical support, returns, and repairs,” she says.

In fact, she tests every product the company sells. “If I don’t think a product is good enough,” she says, “then I don’t put it on our website.” And her attention to detail shows. Inc. magazine last year ranked Adafruit 11th among the fastest-growing manufacturers in the United States and the top one in the New York City metropolitan area from 2011 to 2014.

Fried was also the first woman engineer to be on the cover of Wired magazine, and she has been named a top entrepreneur by several leading business publications.

Adafruit offers open-source code to help people design their projects. “Customers are encouraged to hack into our products to make them their own,” Fried says. They could, for example, use the open-source code on GitHub, a platform that lets users access—and, if they wish, collaborate on—coding projects in order to add functions to their devices.

Fried has been influential in the open-source hardware community. She participated in the first Open Source Hardware Summit and the drafting of the definition of open-source hardware.


Fried doesn’t rely on typical market research to learn what her customers want. Instead, she gives away products in exchange for feedback. “Let customers teach you about what you’re making,” she advises.

Adafruit hosts a weekly “Show and Tell” Google Hangout session in which members of the DIY electronics community showcase the projects they’re working on. That provides the company with insight into not only how people use its products but also what they might want in new ones.

Fried has also made STEM education part of the company’s mission. To help teachers instruct preuniversity students about electronics and programming, she created a DIY kit for building a smartphone.

And Adafruit produces the YouTube series Circuit Playground, meant for children. Each episode covers an electrical engineering concept. For example, one episode is titled “F Is for Frequency.” Fried says she wants electronics to be “just as enjoyable for kids as watching their favorite movies.”

This article originally appeared in print as “The Do-It-Yourself Entrepreneur.”

Drawing in the third dimension

Why are we still drawing like we were during the Renaissance?

Imagine you could reach inside your old Batman comic, grab the Caped Crusader by the shoulder, and spin the whole scene around to get a new 3-D view.

A new software platform from small business Mental Canvas may soon let you do just that.

The technology allows you to draw as you would with pen and paper, except that when you put the pen down, the sketch is viewable from multiple directions, like having access to every camera angle. (Watch this video to get a real sense of it.)

“I think of it as a spatial drawing,” says Julie Dorsey, computer scientist at Yale and founder of Mental Canvas, a tech startup funded by the National Science Foundation (NSF). “Fundamentally, the technology expands on what we think of as a conventional drawing or sketch.”

The tool could someday enable scientists to create 3-D versions of their back-of-the-envelope molecular structures. Architecture students can gain new perspectives on their building designs. Filmmakers can find new ways to visualize storyboards.

Leonardo wishes he had this

The technology started as a passion project of Dorsey’s nearly 10 years ago.

There is a disconnect, she thought, between sketching in 2-D, which is fast and fluid, and modeling in 3-D, which is slow and requires precise geometry.

Dorsey studied architecture as an undergraduate student. She was drawn to computer graphics and graduate work in computer science as a way to expand her interests in designing, rendering and exploring human-made spaces, also known as “built” scenes.

“The built world poses an amazing range of difficult problems: from modeling stone weathering on a Gothic cathedral to simulating lighting effects in a city at night,” she said.

But while we’re surrounded by amazing devices–from smart watches to smart glasses–we’re still drawing like it’s the Renaissance.

“Sketching today isn’t that different than what Leonardo [da Vinci] was doing, even with a tablet,” Dorsey says. “Today’s digital illustration packages merely simulate drawing on paper. They don’t accelerate the sketching process or enhance a sketch’s value as an ideation or communication tool.”

Dorsey began to develop her technology in 2007 with support from a small NSF exploratory research grant. Prior to that, she had explored novel 3-D digital environments for years, with NSF’s support.

Building the platform meant developing a new media type, as well as a set of tools to design and interact with this media form. Dorsey’s challenge was to find a way to explore complex geometric forms through drawing. Her technology incorporates elements of computer-aided design, 2-D and 3-D graphics, software engineering, and human-computer interaction.

“Most people think IT innovations happen in a garage overnight,” says Peter Atherton, the NSF program officer who oversees Mental Canvas’ Small Business Innovation Research grant. “But it actually most often takes years of hard work and support.”

Honing the product

Consumer technologies–even really cool ones–are only successful if consumers actually use them.

In 2012, Dorsey participated in the NSF Innovation Corps program, which trains scientists and engineers to think more like businesspeople in order to get their inventions out of the lab and into the marketplace. She and her team learned how to hone their pitch and product for prospective customers.

That included, for instance, learning how to make her software stand out among other products already on the market.

“In other systems, you can create drawings or paintings on individual layers and those layers slide around in a single plane,” she says. “Our technology involves drawing in space, so the underlying representation is 3-D rather than 2-D, and it’s really fast and fluid.”

As a first demonstration of the capabilities of the software and new media type, Mental Canvas has applied the technology to an illustrated book called “The Other Side” by Istvan Banyai.

The book is a modern-day graphic novel, taking the reader on a complex, visual journey. With Dorsey’s technology rendering the story in 3-D, the effect is immersive. See for yourself. (The company launched it today.)

Doodling in 3-D

The company plans to make the drawing software itself available to consumers in the coming year, and to follow shortly thereafter with versions for specific markets, such as storyboarding and industrial design, that require more computing power and functionality.

The NSF SBIR grant is intended to provide seed funding as the company gains its commercial footing; successes like this then feed back into the nation’s innovation ecosystem.

“Imagine picture books and graphic novels–and illustrations in general–in the future aren’t flat but have active 3-D qualities to move around the story,” Dorsey says.
— Sarah Bates, NSF (703) 292-7738

Julie Dorsey

Related Institutions/Organizations
Yale University
Mental Canvas, LLC
Massachusetts Institute of Technology

Related Programs
Small business

Related Awards
#0839963 SGER: Exploratory Research in Sketch-Based Modeling
#1018470 HCC: Small: Sketching Architectural Designs in Context
#0841534 Exploratory Research in Creative Sketching for 3D Design
#1315663 SBIR Phase I: In-Situ Sketching: Drawing on the Real World
#1044030 EAGER: Exploratory Research in Creative Sketching for 3D Design
#0738472 Mental Canvas: Exploring the Middle Ground Between Sketch and Object
#1248846 I-Corps: The Mental Canvas: A New Approach to Exploring Ideas in 3D
#9988535 Computer Graphics Techniques for Modeling and Rendering Weathered Materials
#1218515 CGV: Small: High-Fidelity Representation of Architectural Material Appearance
#9802220 Computer Graphics for Designing, Capturing, Simulating, and Exploring 3D Environments
#9624172 CAREER: Computer Graphics Techniques for Modeling and Rendering Weathered Materials


Images and video accompanying the article show a drawing of a girl and a cat, people exploring moving illustrations on screen, and an architectural design sketch breaking out of the 2-D plane to appear 3-D.


New optical chip lights up the race for quantum computer

The microprocessor inside a computer is a single multipurpose chip that has revolutionised people’s lives, allowing them to use one machine to surf the web, check emails and keep track of finances. Now, researchers from the University of Bristol in the UK and Nippon Telegraph and Telephone (NTT) in Japan have pulled off the same feat for light in the quantum world by developing an optical chip that can process photons in an infinite number of ways.

It’s a major step forward in creating a quantum computer able to tackle problems such as designing new drugs, performing superfast database searches, and carrying out mathematics that is intractable even for today’s supercomputers.

The fully reprogrammable chip brings together a multitude of existing quantum experiments and can realise a plethora of future protocols that have not even been conceived yet, marking a new era of research for quantum scientists and engineers at the cutting edge of quantum technologies. The work is published today [14 August] in the journal Science.

Since before Newton held a prism to a ray of sunlight and saw a spectrum of colour, scientists have understood nature through the behaviour of light. In the modern age of research, scientists are striving to understand nature at the quantum level and to engineer and control quantum states of light and matter.

A major barrier in testing new theories for quantum science and quantum computing is the time and resources needed to build new experiments, which are typically extremely demanding due to the notoriously fragile nature of quantum systems.

This result shows a step change for experiments with photons, and what the future looks like for quantum technologies.

Dr Anthony Laing, who led the project, said: “A whole field of research has essentially been put onto a single optical chip that is easily controlled. The implications of the work go beyond the huge resource savings. Now anybody can run their own experiments with photons, much like they operate any other piece of software on a computer. They no longer need to convince a physicist to devote many months of their life to painstakingly build and conduct a new experiment.”

The team demonstrated the chip’s unique capabilities by re-programming it to rapidly perform a number of different experiments, each of which would previously have taken many months to build.

Bristol PhD student Jacques Carolan, one of the researchers, added: “Once we wrote the code for each circuit, it took seconds to re-programme the chip, and milliseconds for the chip to switch to the new experiment. We carried out a year’s worth of experiments in a matter of hours. What we’re really excited about is using these chips to discover new science that we haven’t even thought of yet.”

The device was made possible because the world’s leading quantum photonics group teamed up with Nippon Telegraph and Telephone (NTT), the world’s leading telecommunications company.

Professor Jeremy O’Brien, Director of the Centre for Quantum Photonics at Bristol University, explained: “Over the last decade, we have established an ecosystem for photonic quantum technologies, allowing the best minds in quantum information science to hook up with established research and engineering expertise in the telecommunications industry. It’s a model that we need to encourage if we are to realise our vision for a quantum computer.”

The University of Bristol’s pioneering ‘Quantum in the Cloud’ is the first and only service to make a quantum processor publicly accessible, and the team plans to add more chips like this one to the service so others can discover the quantum world for themselves.

The paper, ‘Universal Linear Optics’ by J. Carolan et al., is published in Science.


Intel’s Reinvention of the Hard Drive Could Make All Kinds of Computers Faster

A new kind of hard drive available next year will be able to move your data many times faster than the best today.

Computers from laptops to supercomputers could get a major speed boost next year, thanks to a new kind of hard drive developed by Intel. Intel Optane drives, as they will be called, are based on a new way to store digital data that can operate as much as 1,000 times as fast as the flash memory technology inside solid-state drives, memory sticks, and mobile devices today.

The first Optane drives won’t be that much faster than today’s data storage. An early prototype shown by Intel at its annual developer conference in San Francisco on Tuesday was only about seven times as fast as a top-of-the-range flash disk drive available today. However, even that level of performance could have significant effects on the capabilities of consumer and corporate computers, and Optane drives may perform better by the time they hit the market in 2016.

The sluggish speed of data storage compared to the pace at which processors can work on data has become a significant bottleneck on the capabilities of computers. Several large computing and chip companies have invested heavily in promising new data storage technologies, but none has yet borne fruit. Intel’s Optane drives are based on a technology called 3D Xpoint, developed in collaboration with the memory chip company Micron.

Intel says the technology is affordable enough that Optane drives will be made available next year for uses ranging from large corporate data centers to lightweight laptops. Rob Crooke, a general manager on Intel’s memory project, predicted that they would improve gaming, supercomputers, and data analysis. “We expect to see breakthroughs in personalized medicine, in business analytics to allow companies, cities, and maybe countries to run more efficiently,” he said.

The flash memory chips that are the fastest way to store data today use a grid of clumps of electrons trapped on silicon to represent the 0s and 1s of digital data. A 3D Xpoint chip instead has a grid formed from metal wires layered over one another; data is stored by using electricity to change the arrangement of atoms inside material trapped at each junction of the grid. Just like flash, 3D Xpoint chips hold onto data even when powered down. They can’t currently store data as densely, but Intel says the Xpoint grids can be stacked vertically, providing a route to storing more data on one chip.
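As a rough mental model of that junction grid, consider the toy sketch below. It is purely illustrative: Intel has not published 3D Xpoint's cell-level details, and the class, the dimensions, and the bit-per-junction simplification are invented for this example.

```python
# Toy model of a crosspoint memory array: every junction of a row wire
# and a column wire stores one bit, addressed by selecting exactly one
# row and one column. Real 3D Xpoint cells change the physical state of
# a material at the junction; here that state is just a stored 0 or 1.

class CrosspointGrid:
    def __init__(self, rows, cols):
        self.cells = [[0] * cols for _ in range(rows)]

    def write(self, row, col, bit):
        # Driving one row line and one column line alters only the cell
        # at their intersection.
        self.cells[row][col] = bit

    def read(self, row, col):
        # No per-cell transistor is needed to select a junction, one
        # reason crosspoint layers can be stacked vertically for density.
        return self.cells[row][col]

grid = CrosspointGrid(rows=4, cols=4)
grid.write(2, 3, 1)
print(grid.read(2, 3), grid.read(0, 0))
```

The stacking Intel describes amounts to layering several such grids on one chip, with each layer addressed the same way.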

Intel hasn’t released much more detail about 3D Xpoint, but its basic design is similar to what’s at the heart of an ambitious project by Hewlett-Packard to use devices called memristors to create faster data storage and new computer designs. Other large companies as well as startups are working on similar technology. However, progress has been slower than anticipated and Intel is the only company promising complete hard drives on the market next year. After difficulties with its own memory technology, HP recently scaled back its memristor plans.


IBM’s ‘Rodent Brain’ Chip Could Make Our Phones Hyper-Smart

At a lab near San Jose, IBM has built the digital equivalent of a rodent brain—roughly speaking. It spans 48 of the company’s experimental TrueNorth chips, a new breed of processor that mimics the brain’s biological building blocks.

Dharmendra Modha walks me to the front of the room so I can see it up close. About the size of a bathroom medicine cabinet, it rests on a table against the wall, and thanks to the translucent plastic on the outside, I can see the computer chips and the circuit boards and the multi-colored lights on the inside. It looks like a prop from a ’70s sci-fi movie, but Modha describes it differently. “You’re looking at a small rodent,” he says.

He means the brain of a small rodent—or, at least, the digital equivalent. The chips on the inside are designed to behave like neurons—the basic building blocks of biological brains. Modha says the system in front of us spans 48 million of these artificial nerve cells, roughly the number of neurons packed into the head of a rodent.

Modha oversees the cognitive computing group at IBM, the company that created these “neuromorphic” chips. For the first time, he and his team are sharing their unusual creations with the outside world, running a three-week “boot camp” for academics and government researchers at an IBM R&D lab on the far side of Silicon Valley. Plugging their laptops into the digital rodent brain at the front of the room, this eclectic group of computer scientists is exploring the particulars of IBM’s architecture and beginning to build software for the chip dubbed TrueNorth.

Some researchers who got their hands on the chip at an engineering workshop in Colorado the previous month have already fashioned software that can identify images, recognize spoken words, and understand natural language. Basically, they’re using the chip to run “deep learning” algorithms, the same algorithms that drive the internet’s latest AI services, including the face recognition on Facebook and the instant language translation on Microsoft’s Skype. But the promise is that IBM’s chip can run these algorithms in smaller spaces with considerably less electrical power, letting us shoehorn more AI onto phones and other tiny devices, including hearing aids and, well, wristwatches.

“What does a neuro-synaptic architecture give us? It lets us do things like image classification at a very, very low power consumption,” says Brian Van Essen, a computer scientist at the Lawrence Livermore National Laboratory who’s exploring how deep learning could be applied to national security. “It lets us tackle new problems in new environments.”

The TrueNorth is part of a widespread movement to refine the hardware that drives deep learning and other AI services. Companies like Google and Facebook and Microsoft are now running their algorithms on machines backed with GPUs (chips originally built to render computer graphics), and they’re moving towards FPGAs (chips you can program for particular tasks). For Peter Diehl, a PhD student in the cortical computation group at ETH Zurich and the University of Zurich, TrueNorth outperforms GPUs and FPGAs in certain situations because it consumes so little power.

The main difference, says Jason Mars, a professor of computer science at the University of Michigan, is that the TrueNorth dovetails so well with deep-learning algorithms. These algorithms mimic neural networks in much the same way IBM’s chips do, recreating the neurons and synapses in the brain. One maps well onto the other. “The chip gives you a highly efficient way of executing neural networks,” says Mars, who declined an invitation to this month’s boot camp but has closely followed the progress of the chip.
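To make that mapping concrete, here is a textbook leaky integrate-and-fire neuron, the general style of unit that neuromorphic chips like TrueNorth implement in silicon. The weights, leak factor, and threshold below are made-up illustrative values, not anything from IBM's specification.

```python
# A leaky integrate-and-fire neuron: incoming spikes are weighted by
# synapses and accumulated in a membrane potential that slowly leaks
# away; when the potential crosses a threshold, the neuron fires a
# spike of its own and resets.

class LIFNeuron:
    def __init__(self, weights, threshold=1.0, leak=0.9):
        self.weights = weights      # one synaptic weight per input line
        self.threshold = threshold  # potential at which the neuron fires
        self.leak = leak            # fraction of potential kept each step
        self.potential = 0.0

    def step(self, spikes):
        """Advance one time step; `spikes` is a 0/1 value per input line.
        Returns True if the neuron fires."""
        self.potential = self.potential * self.leak + sum(
            w * s for w, s in zip(self.weights, spikes))
        if self.potential >= self.threshold:
            self.potential = 0.0    # reset after firing
            return True
        return False

# A neuron with two excitatory synapses: steady input spikes on both
# lines push it over threshold every second step.
neuron = LIFNeuron(weights=[0.4, 0.3], threshold=1.0)
outputs = [neuron.step([1, 1]) for _ in range(5)]
print(outputs)  # fires on alternating steps
```

Because computation happens only when spikes arrive, hardware built from such units can sit nearly idle between events, which is the source of the power savings the researchers describe.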


When it comes to learning programming, some things have changed — but not everything

In 1950, fifty-one people attended the Summer School on Programme Design for Automatic Digital Computing Machines at Cambridge University. Over the previous decade, engineering and mathematical researchers had developed the first stored-program computers, and figured out how to operate them as they went; the students who came to Cambridge that summer were the first to sign up to specifically learn the art on Cambridge’s EDSAC computer.

The students who attended were a varied lot, with varying goals — one was actually a salesman for Ferranti, the company that would release the first commercial computer the next year, and he spent more time chatting up potential customers than learning how to program. The physical experience of programming was radically different from how we’d understand it today, as this remarkable film illustrates. Still, they are the predecessors of every young person on summer vacation today who’s waiting for college — and their education in computer science — to start.

With tech changing so quickly, many aspects of these young people’s education will be different from those on a similar track just a decade or so ago — different, in other words, from those who will be their peers and co-workers once they reach the job market. If you’re a long-time IT professional, some of what you did in school will look as outdated to these new students as those keypunch EDSAC programming techniques look to you. It’s instructive to see what has — and hasn’t — changed over the decades.

What you have to work with

Rob Pierce has enjoyed a smorgasbord of decades’ worth of computer education: he took an introduction to computer concepts in the mid-1980s, an introduction to programming course in the 1990s, and is taking a data structures and object-oriented programming class today. One of his observations is basic, but might not occur to someone entering the field today: “One big change is the expectation that everyone has their own computer.” The days when computers were room-sized devices like EDSAC lingered on for decades after computer science became a regular part of college curricula.

Nancie K. began her computer science undergraduate life in 1981, just as computers really started getting personal. She entered a world in transition — and one where rank had its privileges. “When I started, in ’81, the university had around 12 terminals available for the CS department, hooked to the single mainframe the university owned. Only seniors and grad students were allowed to use them; everyone else had to use punch cards.” Even getting access to one of those terminals wasn’t the golden ticket, though: “They were connected to a mainframe the university leased space on, owned by one of the major banks in Florida. Of course, the university jobs had the lowest priority on the computer. I once waited 45 minutes just for a logon prompt — after that I continued to use cards until my final semester.”

But Nancie had an ace up her sleeve: a TRS-80 Model III of her own. Many of her classes were taught using assembly, Pascal, or COBOL, but some were language-agnostic, and for those “I used Fortran, because I had a Fortran compiler for my TRS-80 Model III.” She ended up pushing her hardware to the limit: “the TRS had 48 KB memory, and on at least one occasion I had two versions of my homework. One had comments and was nicely formatted, the other had all of that removed so that the TRS had enough memory to actually compile and run it.”

Just five years later, when Pierce took his first class in 1986, his hardware environment was quite different: a lab full of Apple IIs. If you need a sense of how computer education, and indeed the whole industry, transformed in a very brief period, imagine the leap from punch cards to microcomputers in half a decade. By contrast, Dr. Nick Carlson, a civil engineer and an instructor at New Jersey community colleges, considers the work environment he oversees today — “a lab full of networked desktops running Windows” — as being essentially the same as what he would’ve used when he took his first programming class more than fifteen years earlier.

How far do you drill down?

When I was taking computer classes in high school in the late 1980s, we discussed transistors and logic gates, not that I really remember much of it or ever fully grasped how it related to programming a computer. Still, I wondered whether that is something anyone in school today would be expected to understand at an introductory level. Nancie K. may have been working on assembly language code in the early 1980s, but in Rob Pierce’s experience, modern-day classes are quite different. “C/C++ has been replaced by higher-level VB and Java,” he says, “and ‘while’ and ‘for’ loops are taught long before stacks and pointers.”

“I have an uncle who worked for IBM back in the 1960s and he used to program with a soldering iron, so he says. He did programming in assembly. I’ve never had a class in assembly.” In fact, says Pierce, “none of the intro classes teach assembly language. At the junior level there’s Computational Structures, which goes into the theory and math behind binary logic and arithmetic, graphing, shortest-route, and so on. Another class goes into the hardware side of this, with logic gates and some assembly programming.”
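
The gate-level material Pierce describes can be bridged to high-level code in just a few lines. As a purely illustrative example (not drawn from any of the courses mentioned), here is a half-adder — the simplest logic-gate circuit that performs binary arithmetic — expressed with Python’s bitwise operators:

```python
# Half-adder: the simplest logic-gate circuit that performs arithmetic.
# XOR produces the sum bit, AND produces the carry bit.
def half_adder(a: int, b: int) -> tuple[int, int]:
    """Add two bits; return (sum_bit, carry_bit)."""
    return a ^ b, a & b

# Truth table: note that 1 + 1 gives sum 0, carry 1 (binary 10).
for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"{a} + {b} -> sum={s} carry={c}")
```

Chaining two half-adders (plus an OR gate for the carries) yields a full adder, which is how the “binary logic and arithmetic” theory connects to real hardware.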

It seems that while computer science once took a view that to learn the discipline you had to trace it to its beginnings on practically bare metal, today that’s considered an advanced-level branch of study. You don’t need it for the basics.

But what’s it all for?

Beyond the nuts and bolts of what specifically you’d study and what machines you’d study it on, there’s a bigger question looming over the field: why would you bother studying the subject at all?

My impression is that the level of practicality involved waxed and waned over the years.

In the ’80s, when Nancie K. was taking classes, for instance, she says that “the focus was definitely on number-crunching and databases on mainframes for major systems. I had vague notions that I’d be writing programs to track and/or analyze numbers, probably for a government contractor. As it happens, that’s about what I was doing my first five years after graduation. And minus the government contractor, it’s what I’m still doing.”

This was an age when for some years, the field had been — not stagnant, necessarily, but mature. Computers as data processors were well understood by businesses, and were very lucrative for both companies that sold them and organizations that used them. But a revolution was brewing. “Things actually had started to shift by ’84 — there were a few computer graphics classes in the catalog by my last semester, as well as some hardware-focused microcomputer/PC classes. The Last Starfighter came out around that time, so people were starting to see the creative things computers could do.”

By the turn of the century, the classes Dr. Carlson took in high school and college had a very different feel. “Both programming classes I took, especially the intro to programming for engineers class in college, were very focused on just teaching the isolated mechanics of the language — this is a function, this is an if statement, this is a loop, etc. — without any real consideration to applications besides whatever contrived problems were on the homework. They weren’t taught with the idea that you’d eventually want to try to put these pieces together into a bigger whole that would actually do something relevant or useful.”

He sees this too as a reflection of the times — there were kids who “enjoyed computers for the sake of tinkering with a computer,” as the PC revolution had brought a wave of machines into homes but hadn’t quite cooled off to the point that they had become dull appliances. “I feel like whoever was designing the curriculum was vaguely aware that people taking the class might want to do ‘stuff with computers’ in the future and didn’t feel the need to try to tie it to any other discipline.”

The class Dr. Carlson teaches now is called “programming for engineers,” and is much more aimed at practical use. The language it’s based on is MATLAB, “a numerical programming language that’s fairly popular with academia and engineers.” In his class, “the syllabus for the programming for engineers class is extremely focused on how to apply programming as a tool, starting with the typical problems — things like ‘how do I analyze this instrumentation data?’, ‘how do I simulate this physical process?’, ‘how do I automate this repetitive calculation?’ — and showing how the language fits into the solution, introducing the new concepts they’d need to solve it. Writing the code is just one step of the whole solution, since the students need to understand the physical basis of the code they’re writing and correctly interpret the output it produces. The class is also set up to show students the limitations of computers, like floating-point accuracy or the concept of garbage-in, garbage-out, which is definitely relevant to them since they’ll be using software for design even if they never write their own code again.”
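
The floating-point and garbage-in, garbage-out pitfalls Carlson mentions are easy to demonstrate in any language. His course uses MATLAB; the sketch below makes the same two points in Python, with invented numbers:

```python
# 1. Floating-point accuracy: 0.1 has no exact binary representation,
#    so naive equality checks on decimal arithmetic fail.
total = 0.1 + 0.2
print(total == 0.3)             # False: total is 0.30000000000000004
print(abs(total - 0.3) < 1e-9)  # True: compare with a tolerance instead

# 2. Garbage in, garbage out: a perfectly correct averaging routine
#    still returns a meaningless answer if the input data is bad.
readings = [20.1, 19.8, 20.0, 999.9]   # last value is a sensor glitch
print(sum(readings) / len(readings))   # ~264.95 -- "correct" code, bad result
```

The second lesson is the one Carlson flags as lasting: engineers who never write code again will still feed data into design software, and the software won’t warn them when the inputs are nonsense.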

In this way, the students of today aren’t that different from the fifty-one people who attended the Summer School on Programme Design for Automatic Digital Computing Machines at Cambridge University in 1950. Those students didn’t all necessarily want to become computer scientists — it wasn’t particularly clear that “computer science” was its own thing yet. Many were chemists and mathematicians who were just excited about the practical ways new technology could make their existing jobs easier.

A true vocation

In fact, the practical needs of both students and employers have given rise to a whole category of computer science education under the aegis of schools that aren’t colleges at all. These “code schools” are aimed at eschewing theory and giving students practical skills in a short amount of time. As Christopher Mims put it in the Wall Street Journal, “we’ve entered an age in which demanding that every programmer has a degree is like asking every bricklayer to have a background in architectural engineering.”

And indeed, employers are looking for a focus on the practical as well. It’s not that the skills colleges teach are obsolete — but students seem to need extra help once they hit the industry. Facebook puts its new hires through an intensive programming course when they arrive, for instance.

Dave Parker, CEO of Code Fellows, a software programming trade school in Seattle, says that “there aren’t a lot of IT jobs left managing hardware or server farms, some of those old degree programs need to recognize that and move on.” And the software stacks that the programmers use in day-to-day life change more quickly than college curricula can accommodate, says Parker.

While MOOCs like Udacity have made sweeping claims that they’ll replace universities, Parker doesn’t see CS degree programs going away anytime soon. But courses like the ones his company offers are a useful supplement. “Most employers still want CS grads with five years (real world) experience. We get to deliver ‘life experience’ candidates with fresh technical skills on a stack that is in high demand today. It doesn’t replace a CS degree, but depending where you are in your career it’s a great alternative.”

Perhaps the best way of thinking about the future of CS education is that even if you’ve got a degree, you’re going to keep needing an education.

University technology program launched to give peace a chance

Computer science and engineering students at Drexel University will have a new opportunity to use their skills for good with the launch of a technology program for promoting world peace.

The Young Engineers Program, a partnership between the private university in Philadelphia and PeaceTech Lab in Washington, D.C., will give computer science and engineering students and researchers at Drexel a chance to focus on conflict zones around the world.

The goal of the program is to use technology, media and data to prevent violent conflict in hot spots like Afghanistan, Sudan and Colombia, said Sheldon Himelfarb, president and CEO of the PeaceTech Lab.

Social media and data tools are “really changing the nature of international conflict resolution and peace building,” Himelfarb said.

The new program, with peace engineering classes starting at Drexel this fall, will give students an opportunity to bring fresh ideas to conflict resolution, Himelfarb said. Most of the students involved will be digital natives, and they can bring innovative approaches to technology that other generations may not, he said.

“We’ve really learned that having the eyes and ears of this younger generation of technologists and engineers is crucial,” he said.

Social media and data crunching technologies can help track violent outbreaks in hot spots, and the lab is looking for new ideas on how to use those tools, Himelfarb said. In many conflict zones, mobile phones are pervasive, and mobile apps can help with conflict resolution efforts, he said.

In Afghanistan, for example, the literacy rate is about 35 percent, but about 70 percent of people own mobile phones, he said.

Drexel students Tristan Leung and Dagmawi Mulugeta, both sophomores majoring in computer engineering, said they’re looking forward to the PeaceTech Lab classes. The opportunity for students to work in a cooperative education setting at PeaceTech Lab in Washington, starting in the spring of 2016, will allow students to use their classroom education in a real-world setting, Leung said.

In addition, the program will “help better the world,” Leung said.

For Mulugeta, the program offers an opportunity for him to be of service to his native Ethiopia. “One of the reasons I joined Drexel is because it was so diverse and so international,” he said. “I would like to give back to my home country, to the people I grew up with in some way.”

The new program will deliver most of its classes online, giving people outside the university’s main campuses a chance to participate, said Joseph Hughes, dean of the Drexel College of Engineering. For on-campus students, the co-op element of the program will allow them to “get their professional experience before graduation,” he said.

In the future, the program may allow Drexel students to participate in international co-op programs as well as work at PeaceTech Lab.

The new program also aligns closely with Drexel’s vision of being the “most civically engaged university in the country,” Hughes added. “The conflict that arises in communities, regions or countries can be mitigated, to a certain degree, through civil engagement.”

The PeaceTech Lab, launched in 2014 as an offshoot of the government-funded U.S. Institute of Peace, brings together engineers, activists, social scientists, data scientists and other experts to develop tech, social media and data tools focused on preventing violent conflict.


Stanford team’s brain-controlled prosthesis nearly as good as one-finger typing

Years of work have yielded a technique that continuously corrects brain readings to give people with spinal cord injuries a more precise way to tap out commands by using a thought-controlled cursor. A pilot clinical trial for human use is underway.

When we type or perform other precise tasks, our brains and muscles usually work together effortlessly.

But when a neurological disease or spinal cord injury severs the connection between the brain and limbs, once-easy motions become difficult or impossible.

In recent years researchers have sought to give people suffering from injury or disease some restored motor function by developing thought-controlled prostheses.

Such devices tap into the relevant regions of the brain, bypass damaged connections and deliver thought commands to devices such as virtual keypads.

But brains are complex. Actions and thoughts are orchestrated by millions of neurons – biological switches that fire faster or slower in dynamic patterns.

Brain-controlled prostheses currently work with access to a sample of only a few hundred neurons, but need to estimate motor commands that involve millions of neurons. So tiny errors in the sample – neurons that fire too fast or too slow – reduce the precision and speed of thought-controlled keypads.

Now an interdisciplinary team led by Stanford electrical engineer Krishna Shenoy has developed a technique to make brain-controlled prostheses more precise. In essence the prostheses analyze the neuron sample and make dozens of corrective adjustments to the estimate of the brain’s electrical pattern – all in the blink of an eye.

Shenoy’s team tested a brain-controlled cursor meant to operate a virtual keyboard. The system is intended for people with paralysis and amyotrophic lateral sclerosis (ALS), also called Lou Gehrig’s disease. ALS degrades one’s ability to move. The thought-controlled keypad would allow a person with paralysis or ALS to run an electronic wheelchair and use a computer or tablet.

“Brain-controlled prostheses will lead to a substantial improvement in quality of life,” Shenoy said. “The speed and accuracy demonstrated in this prosthesis results from years of basic neuroscience research and from combining these scientific discoveries with the principled design of mathematical control algorithms.”

Brain dynamics

The new corrective technique is based on a recently discovered understanding of how monkeys naturally perform arm movements. The researchers studied animals that were normal in every way. The monkeys used their arms, hands and fingers to reach for targets presented on a video screen. What the researchers sought to learn through hundreds of experiments was what the electrical patterns from the 100- to 200-neuron sample looked like during a normal reach. In short, they came to understand the “brain dynamics” underlying reaching arm movements.

“These brain dynamics are analogous to rules that characterize the interactions of the millions of neurons that control motions,” said Jonathan Kao, a doctoral student in electrical engineering and first author of the Nature Communications paper on the research. “They enable us to use a tiny sample more precisely.”

In their current experiments Shenoy’s team members distilled their understanding of brain dynamics into an algorithm that could analyze the measured electrical signals that their prosthetic device obtained from the sampled neurons. The algorithm tweaked these measured signals so that the sample’s dynamics were more like the baseline brain dynamics. The goal was to make the thought-controlled prosthetic more precise.
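
The team’s actual algorithm is far more sophisticated, but the core idea — nudging a noisy decoded state toward what a model of the baseline dynamics predicts — can be sketched in a few lines. Everything below (the dynamics matrix, the gain, the states) is invented purely for illustration and is not the Stanford method:

```python
import numpy as np

# Toy illustration only: a noisy decoded state is blended with a
# prediction from a hypothetical linear model of baseline dynamics.
A = np.array([[1.0, 0.1],    # made-up dynamics: position integrates velocity,
              [0.0, 0.9]])   # velocity decays toward zero

def correct(prev_state, measurement, gain=0.5):
    """Pull a raw measurement toward what the dynamics model predicts."""
    predicted = A @ prev_state                  # where the dynamics say we should be
    return predicted + gain * (measurement - predicted)

state = np.array([0.0, 1.0])   # previous estimate: at rest, unit velocity
noisy = np.array([0.3, 0.8])   # raw decode, corrupted by sampling error
state = correct(state, noisy)
print(state)                   # lands between the prediction and the measurement
```

This blend-toward-a-model structure is the same intuition behind Kalman-style filtering: when only a tiny neuron sample is available, a model of how the signal should evolve compensates for noise in any single reading.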

To test this algorithm the Stanford researchers trained two monkeys to choose targets on a simplified keypad. The keypad consisted of several rows and columns of blank circles. When a light flashed on a given circle the monkeys were trained to reach for that circle with their arms.

To set a performance baseline the researchers measured how many targets the monkeys could tap with their fingers in 30 seconds. The monkeys averaged 29 correct finger taps in 30 seconds.

The real experiment scored only virtual taps made with the monkeys’ brain-controlled cursor. Although a monkey may still have moved its fingers, the researchers counted a hit only when the brain-controlled signal, corrected by the algorithm, sent the virtual cursor to the target.

The prosthetic scored 26 thought-taps in 30 seconds, about 90 percent as quickly as a monkey’s finger. (See video of hand- versus thought-controlled cursor taps.)

Thought-controlled keypads are not unique to Shenoy’s lab. Other brain-controlled prosthetics use different techniques to solve the problem of sampling error. Of several alternative techniques tested by the Stanford team, the closest resulted in 23 targets in 30 seconds.

Next steps

The goal of all this research is to get thought-controlled prosthetics to people with ALS. Today these people may use an eye-tracking system to direct cursors or a “head mouse” that tracks the movement of the head. Both are fatiguing to use. Neither provides the natural and intuitive control of readings taken directly from the brain.

The U.S. Food and Drug Administration recently gave Shenoy’s team the green light to conduct a pilot clinical trial of their thought-controlled cursor on people with spinal cord injuries.

“This is a fundamentally new approach that can be further refined and optimized to give brain-controlled prostheses greater performance, and therefore greater clinical viability,” Shenoy said.

Paul Nuyujukian, a postdoctoral researcher and MD/PhD in neurosurgery and electrical engineering, also contributed to the research, as did Stephen Ryu, a neurosurgeon with the Palo Alto Medical Foundation and consulting professor of electrical engineering. Columbia University assistant professors Mark Churchland, in the Neuroscience Department, and John Cunningham, in the Statistics Department, completed the team roster.

Funding for the experiments came from a Director’s Pioneer Award from the National Institutes of Health, a T-RO1 Award from the National Institutes of Health and two programs from the Defense Advanced Research Projects Agency: REPAIR (Reorganization and Plasticity to Accelerate Injury Recovery) and Neuro-FAST (Neuro Function, Activity, Structure, and Technology).


Researchers Create First Firmware Worm That Attacks Macs

The common perception when it comes to PCs and Apple computers is that the latter are much more secure. Particularly when it comes to firmware, people have assumed that Apple systems are locked down in ways that PCs aren’t.

It turns out this isn’t true. Two researchers have found that several known vulnerabilities affecting the firmware of all the top PC makers can also hit the firmware of Macs. What’s more, the researchers have designed the first proof-of-concept worm that would allow a firmware attack to spread automatically from MacBook to MacBook, without the machines needing to be networked.

The attack raises the stakes considerably for system defenders since it would allow someone to remotely target machines—including air-gapped ones—in a way that wouldn’t be detected by security scanners and would give an attacker a persistent foothold on a system even through firmware and operating system updates. Firmware updates require the assistance of a machine’s existing firmware to install, so any malware in the firmware could block new updates from being installed or simply write itself to a new update as it’s installed.

The only way to eliminate malware embedded in a computer’s main firmware would be to re-flash the chip that contains the firmware.

“[The attack is] really hard to detect, it’s really hard to get rid of, and it’s really hard to protect against something that’s running inside the firmware,” says Xeno Kovah, one of the researchers who designed the worm. “For most users that’s really a throw-your-machine-away kind of situation. Most people and organizations don’t have the wherewithal to physically open up their machine and electrically reprogram the chip.”

It’s the kind of attack intelligence agencies like the NSA covet. In fact, documents released by Edward Snowden, and research conducted by Kaspersky Lab, have shown that the NSA has already developed sophisticated techniques for hacking firmware.

The Mac firmware research was conducted by Kovah, owner of LegbaCore, a firmware security consultancy, and Trammell Hudson, a security engineer with Two Sigma Investments. They’ll be discussing their findings on August 6 at the Black Hat security conference in Las Vegas.

A computer’s core firmware—also referred to at times as the BIOS, UEFI or EFI—is the software that boots a computer and launches its operating system. It can be infected with malware because most hardware makers don’t cryptographically sign the firmware embedded in their systems, or their firmware updates, and don’t include any authentication functions that would prevent any but legitimate signed firmware from being installed.

Firmware is a particularly valuable place to hide malware on a machine because it operates at a level below the one where antivirus and other security products operate, and therefore does not generally get scanned by these products, leaving malware that infects the firmware unmolested. There’s also no easy way for users to manually examine the firmware themselves to determine if it’s been altered. And because firmware remains untouched if the operating system is wiped and re-installed, malware infecting the firmware can maintain a persistent hold on a system throughout attempts to disinfect the computer.

5 Firmware Vulnerabilities in Macs

Last year, Kovah and his partner at LegbaCore, Corey Kallenberg, uncovered a series of firmware vulnerabilities that affected 80 percent of the PCs they examined, including ones from Dell, Lenovo, Samsung and HP. Although hardware makers implement some protections to make it difficult for someone to modify their firmware, the vulnerabilities the researchers found allowed them to bypass these and reflash the BIOS to plant malicious code in it.

Kovah, along with Hudson, then decided to see if the same vulnerabilities applied to Apple firmware and found that untrusted code could indeed be written to the MacBook boot flash firmware. “It turns out almost all of the attacks we found on PCs are also applicable to Macs,” says Kovah.

They looked at six vulnerabilities and found that five of them affected Mac firmware. The vulnerabilities are applicable to so many PCs and Macs because hardware makers tend to all use some of the same firmware code.

“Most of these firmwares are built from the same reference implementations, so when someone finds a bug in one that affects Lenovo laptops, there’s a really good chance it’s going to affect the Dells and HPs,” says Kovah. “What we also found is that there is really a high likelihood that the vulnerability will also affect MacBooks, because Apple is using a similar EFI firmware.”

In the case of at least one vulnerability, there were specific protections that Apple could have implemented to prevent someone from updating the Mac code but didn’t.

“People hear about attacks on PCs and they assume that Apple firmware is better,” Kovah says. “So we’re trying to make it clear that any time you hear about EFI firmware attacks, it’s pretty much all x86 [computers].”

They notified Apple of the vulnerabilities, and the company has already fully patched one and partially patched another. But three of the vulnerabilities remain unpatched.

Thunderstrike 2: Stealth Firmware Worm for Macs

Using these vulnerabilities, the researchers then designed a worm they dubbed Thunderstrike 2 that can spread between MacBooks undetected. It can remain hidden because it never touches the computer’s operating system or file system. “It only ever lives in firmware, and consequently no [scanners] are actually looking at that level,” says Kovah.

The attack infects the firmware in just seconds and can also be done remotely.

There have been examples of firmware worms in the past, but they spread between devices like home-office routers and involved infecting the routers’ Linux operating system. Thunderstrike 2, however, is designed to spread by infecting what’s known as the option ROM on peripheral devices.

An attacker could first remotely compromise the boot flash firmware on a MacBook by delivering the attack code via a phishing email or a malicious website. That malware would then be on the lookout for any peripherals connected to the computer that contain option ROM, such as an Apple Thunderbolt Ethernet adapter, and infect the firmware on those. The worm would then spread to any other computer to which the adapter gets connected.

When another machine is booted with this worm-infected device inserted, the machine firmware loads the option ROM from the infected device, triggering the worm to initiate a process that writes its malicious code to the boot flash firmware on the machine. If a new device is subsequently plugged into the computer and contains option ROM, the worm will write itself to that device as well and use it to spread.

One way to randomly infect machines would be to sell infected Ethernet adapters on eBay or infect them in a factory.

“People are unaware that these small cheap devices can actually infect their firmware,” says Kovah. “You could get a worm started all around the world that’s spreading very low and slow. If people don’t have awareness that attacks can be happening at this level then they’re going to have their guard down and an attack will be able to completely subvert their system.”

In a demo video Kovah and Hudson showed WIRED, they used an Apple Thunderbolt to Gigabit Ethernet adapter, but an attacker could also infect the option ROM on an external SSD or on a RAID controller.

No security products currently check the option ROM on Ethernet adapters and other devices, so attackers could move their worm between machines without fear of being caught. They plan to release some tools at their talk that will allow users to check the option ROM on their devices, but the tools aren’t able to check the boot flash firmware on machines.

The attack scenario they demonstrated is ideal for targeting air-gapped systems that can’t be infected through network connections.

“Let’s say you’re running a uranium refining centrifuge plant and you don’t have it connected to any networks, but people bring laptops into it and perhaps they share Ethernet adapters or external SSDs to bring data in and out,” Kovah notes. “Those SSDs have option ROMs that could potentially carry this sort of infection. Perhaps because it’s a secure environment they don’t use WiFi, so they have Ethernet adapters. Those adapters also have option ROMs that can carry this malicious firmware.”

He likens it to how Stuxnet spread to Iran’s uranium enrichment plant at Natanz via infected USB sticks. But in that case, the attack relied on zero-day attacks against the Windows operating system to spread. As a result, it left traces in the OS where defenders might be able to find them.

“Stuxnet sat around as a kernel driver on Windows file systems most of the time, so basically it existed in very readily available, forensically-inspectable places that everybody knows how to check. And that was its Achilles’ heel,” Kovah says. But malware embedded in firmware would be a different story since firmware inspection is a vicious circle: the firmware itself controls the ability of the OS to see what’s in the firmware, thus a firmware-level worm or malware could hide by intercepting the operating system’s attempts to look for it. Kovah and colleagues showed how firmware malware could lie like this at a talk they gave in 2012. “[The malware] could trap those requests and just serve up clean copies [of code]… or hide in system management mode where the OS isn’t even allowed to look,” he says.

Hardware makers could guard against firmware attacks if they cryptographically signed their firmware and firmware updates and added authentication capabilities to hardware devices to verify these signatures. They could also add a write-protect switch to prevent unauthorized parties from flashing the firmware.

Although these measures would guard against low-level hackers subverting the firmware, well-resourced nation-state attackers could still steal a hardware maker’s master key to sign their malicious code and bypass these protections.

Therefore, an additional countermeasure would involve hardware vendors giving users the ability to easily read their machine’s firmware to determine if it has changed since installation. If vendors provided a checksum of the firmware and firmware updates they distribute, users could periodically check whether what’s installed on their machine matches those checksums. A checksum here is a cryptographic hash: a short string of letters and numbers produced by running the data through an algorithm. It is effectively unique to its input, so if anything in the data changes, the checksum changes too.
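
The verification scheme described above is straightforward to sketch. The sketch below uses Python’s standard `hashlib`; the “firmware” bytes are stand-ins, since dumping a machine’s real firmware requires vendor- or platform-specific tools:

```python
import hashlib

def sha256_digest(data: bytes) -> str:
    """Compute the SHA-256 checksum of a blob of data."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical digest a vendor might publish alongside a firmware image.
vendor_digest = sha256_digest(b"firmware v1.0 image bytes")

# A user re-hashes what's actually installed and compares.
installed = b"firmware v1.0 image bytes"
print(sha256_digest(installed) == vendor_digest)  # True: firmware matches

# Flipping even one byte produces a completely different digest.
tampered = b"firmware v1.0 image byteZ"
print(sha256_digest(tampered) == vendor_digest)   # False: modification detected
```

Note this only detects tampering after the fact; as the article goes on to explain, a firmware-resident attacker who controls what the OS can read could also lie about the bytes being hashed, which is why signed firmware and hardware write-protection matter too.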

But hardware makers aren’t implementing these changes because it would require re-architecting systems, and in the absence of users demanding more security for their firmware, hardware makers aren’t likely to make the changes on their own.

“Some vendors like Dell and Lenovo have been very active in trying to rapidly remove vulnerabilities from their firmware,” Kovah notes. “Most other vendors, including Apple as we are showing here, have not. We use our research to help raise awareness of firmware attacks, and show customers that they need to hold their vendors accountable for better firmware security.”