This Computer Pioneer’s Invention Made Zoom Possible

The service uses Erol Gelenbe’s packet-voice telephone switch

6 min read
Peter Weiner

Many IEEE members credit a parent or a teacher for encouraging them to pursue an engineering career, but for Erol Gelenbe it was his next-door neighbor in his hometown of Ankara, Turkey.

Egbert Adriaan Kreiken, a professor of astronomy at Ankara University, convinced the teenager that electronics engineering was going to be the “next big thing.” That was in the late 1950s.

“Kreiken told me studying anything else was absolutely crazy,” Gelenbe recalls. “He started selling the idea to me and even bribed me by promising an internship at Philips, where he had connections. He had emigrated from the Netherlands. He kind of twisted my arm.” Philips was the largest electronics company in the world.

Because electrical engineering uses a lot of mathematics, a subject Gelenbe liked, he figured he wouldn’t be wasting his time by studying EE. He ended up loving the subject and went on to earn a bachelor’s, a master’s, and a doctoral degree, all in EE. And Kreiken did keep his promise about that internship at Philips, where Gelenbe conducted measurements on some early transistor circuits.

Riding the wave of that “next big thing” has paid off for the IEEE Fellow. He is recognized as a pioneer in the field of modeling and performance evaluation of computer systems and networks. He is best known for creating G-networks and random neural networks, probabilistic models inspired by the spiking behavior of neurons. An RNN can offer efficient learning algorithms for recurrent networks such as those used for cybersecurity applications and video compression.
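As a rough illustration (not Gelenbe's own code), the steady state of a random neural network can be computed by iterating its signal-flow equations: each neuron's excitation probability is the ratio of the excitatory spike traffic it receives to its firing rate plus the inhibitory traffic it receives. All weights and rates below are made-up values for a hypothetical three-neuron network.

```python
# Hypothetical 3-neuron example of a random neural network (RNN).
# q[i], the probability that neuron i is excited, solves the signal-flow
# equations  q[i] = lam_plus[i] / (r[i] + lam_minus[i]),  where the total
# excitatory (lam_plus) and inhibitory (lam_minus) arrival rates at each
# neuron themselves depend on q, so we iterate to a fixed point.

N = 3
w_plus  = [[0.0, 0.3, 0.2], [0.1, 0.0, 0.4], [0.2, 0.1, 0.0]]  # excitatory weight j -> i
w_minus = [[0.0, 0.1, 0.1], [0.2, 0.0, 0.1], [0.1, 0.2, 0.0]]  # inhibitory weight j -> i
Lam = [0.5, 0.2, 0.3]  # external excitatory spike rates (made up)
lam = [0.1, 0.1, 0.1]  # external inhibitory spike rates (made up)
# Firing rate of each neuron: the total weight it emits.
r = [sum(w_plus[j]) + sum(w_minus[j]) for j in range(N)]

q = [0.0] * N
for _ in range(200):  # fixed-point iteration
    lam_plus  = [Lam[i] + sum(q[j] * w_plus[j][i]  for j in range(N)) for i in range(N)]
    lam_minus = [lam[i] + sum(q[j] * w_minus[j][i] for j in range(N)) for i in range(N)]
    q = [min(lam_plus[i] / (r[i] + lam_minus[i]), 1.0) for i in range(N)]

print([round(v, 4) for v in q])  # steady-state excitation probability of each neuron
```

Because the equations have this product-form fixed point, the network's steady state can be computed exactly rather than estimated by sampling, which is part of what makes RNN learning algorithms efficient.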

Other technologies he invented can be found in production lines and in the first packet-based video-conferencing systems.

“For my Ph.D., I was interested in computers, but not computers in the sense of practical things,” Gelenbe says. “I was interested in computation—how computers can calculate, and also in how the languages that computers process are transformed by random events that can occur for a variety of reasons. Then I became interested in the foundations of the design of computer systems and networks, and how design choices impact their performance.”

Gelenbe has spent his entire career in academia, and he has established a computer research lab at just about every university where he taught. Currently he is a professor at the Polish Academy of Sciences’ Institute of Theoretical and Applied Informatics, in Gliwice, where he conducts research and supervises Ph.D. students.

To recognize his many contributions, this year the Islamic World Academy of Sciences made Gelenbe an honorary Fellow.


Gelenbe graduated in 1966 with a bachelor’s degree in engineering from Middle East Technical University, in Ankara. He was awarded a Fulbright fellowship to pursue a master’s degree at Polytechnic Institute of Brooklyn (now part of New York University) which, in the 1960s, was considered one of the leading engineering schools. With financial support from a NATO science fellowship, he stayed on at the school to earn a doctorate in electrical engineering in 1970.

His doctoral thesis was on stochastic automata theory, “which was an esoteric subject at that time for the foundation of computers, with impacts on digital systems and formal languages,” he says. “The results in my thesis were about how randomness can actually enhance the power of computation.”

His first teaching job was as an assistant professor of computer, information, and control engineering at the University of Michigan in Ann Arbor. The university had one of the first computer science departments in the United States. One of the two courses he was assigned to teach each semester was on computer systems.

But Gelenbe had a problem. “I knew nothing about computer systems,” he now acknowledges with a laugh. At the time there wasn’t much literature in the field, so, he says, he devised his own explanations by conducting research and writing papers.

“I barely kept one step ahead of the students, researching and learning at night and teaching in the morning,” he says. “To develop a rational and systematic understanding of the subject so that I could explain it in a sensible way to the students, I was doing research about aspects that I did not understand.

“If you have something which is not really built on theory but just built on practice, you must extract or invent the theoretical elements. Through theoretical methods and math-based methods I became interested in explaining how computer systems and networks work.”

He found out that a few other professors at the time were in the same boat.

“It turned out that I was not the only poor chap or poor woman trying to do that, because some colleagues also had this problem,” he says. “The field was essentially built through the work of a few people like me who were at other universities such as Princeton and Harvard.”

He began contributing groundbreaking research of his own while at Michigan. He developed queueing network models of computer systems and networks, using them to establish optimal design choices for problems as diverse as computer communication protocols, checkpointing in databases, and page replacement in virtual memory systems.

He invented G-networks, generalized queueing networks that model both data networks and neural networks. A G-network, also known as a Gelenbe network, is used for queueing systems with specific control functions, such as data packets waiting to be transmitted to devices, like those used for network telephony.
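A minimal sketch of a single G-network queue can make the idea concrete. Here positive customers (say, packets) arrive at rate λ⁺ and are served at rate μ, while negative customers arrive at rate λ⁻ and each deletes one waiting customer; the product-form theory then predicts utilization ρ = λ⁺/(μ + λ⁻) and mean queue length ρ/(1 − ρ). The rates below are arbitrary illustrative values, not from any real system.

```python
import random

# Minimal sketch of a single G-network ("Gelenbe network") queue.
# Positive customers arrive at rate LAM_POS and are served at rate MU;
# negative customers arrive at rate LAM_NEG and each removes one waiting
# customer (doing nothing if the queue is empty). Theory predicts
# utilization rho = LAM_POS / (MU + LAM_NEG) and mean length rho / (1 - rho).

LAM_POS, LAM_NEG, MU = 1.0, 0.5, 2.0  # arbitrary illustrative rates
random.seed(42)

n = 0          # customers currently in the queue
t = 0.0        # simulated time
area = 0.0     # integral of n over time, for the time-average queue length
HORIZON = 100_000.0

while t < HORIZON:
    rate = LAM_POS + LAM_NEG + (MU if n > 0 else 0.0)
    dt = random.expovariate(rate)       # time to the next event
    area += n * dt
    t += dt
    u = random.uniform(0.0, rate)       # pick which event occurred
    if u < LAM_POS:
        n += 1                          # positive arrival joins the queue
    elif u < LAM_POS + LAM_NEG:
        n = max(n - 1, 0)               # negative arrival deletes a customer
    else:
        n -= 1                          # service completion

rho = LAM_POS / (MU + LAM_NEG)
print(f"simulated mean queue length: {area / t:.3f}")
print(f"theoretical rho/(1-rho):     {rho / (1 - rho):.3f}")
```

The negative-customer branch is what distinguishes a G-queue from an ordinary M/M/1 queue: inhibitory traffic actively removes work, yet the stationary distribution still has the same simple geometric form.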

In addition, he worked on improving the performance of multiprogramming computer systems, virtual memory management, and database reliability. He designed and built the first random-access LAN that uses fiber-optic connections, long before Ethernet became the standard.

Gelenbe says he had the most fun inventing the cognitive packet network, for real-time adaptive routing in packet networks without routing tables, for which he holds a patent. He holds two patents for the first packet-voice telephone switch, which he designed in the 1990s. The switch is used in Skype and Zoom video-conferencing services. He also created FlexSim, a computer-based model of a production system used for inventory, assembly, and transportation systems.

Not only was he inventing, he was also busy setting up computer science research labs around the world. They include the Modeling and Performance Evaluation of Computer Systems research group at Inria, the French national research institute for digital science and technology, in Paris; the Laboratory for Computer Science at Université Paris-Saclay; L’école d’Informatique Hesias at Université Paris Cité; and the University of Central Florida’s college of electrical engineering and computer science, in Orlando.

After retiring in 2019 from Imperial College London, Gelenbe joined the Polish Academy of Sciences. There he is researching energy-efficient computer systems, self-aware networks, network security, and networked auctions. He also consults for several companies.

He enjoys academia, he says, because “you’re always surrounded by young people, so you’re constantly in contact with people who are asking questions and don’t take things for granted.”


Gelenbe has been elected Fellow of the French National Academy of Technologies, the Belgian Royal Academy of Sciences, Arts, and Letters, and the science academies of Hungary, Poland, and Turkey. In 2008 ACM awarded him its annual SIGMETRICS Achievement Award, calling him “the single individual who, over a span of 30 years, has made the greatest overall contribution to the field of computer system and network performance evaluation.”

France bestowed on him its Order of the Legion of Honor’s chevalier award, and named him a commander of the National Order of Merit for his service to higher education and research in the country. While serving as a science and technology advisor to the French minister for universities from 1984 to 1986, he introduced computer science education to all undergraduate programs.

Italy honored him with a Commander of Merit award and named him a grand officer of the Order of the Star.


Despite Gelenbe’s storied career, he has faced racism along the way.

“I’ve experienced forms of discrimination in every country I’ve been to,” he says. “Sometimes it’s blatant.”

In his first academic job in France, he says, one day he found a note written in French and posted on his office door. It said: “Immigrant worker in his Sunday best.”

“The implication was that I was an immigrant, which is true, and a worker, which is also true. I like to work hard,” he says. “All that is fine, but the ‘Sunday best’ reference was that I was trying to look better than I should.”

Another example was an article about him that said: “Someone like that should not be a professor here, because most people who come from that same country are working in the coal mines.”

He says he hasn’t overcome the pain such comments have caused him; he’s just developed a thick skin: “I have accepted that it’s ‘normal’ that I will not get equal treatment in certain circumstances. I’ve compensated for this by the fact that I am quite gregarious and have developed connections with people.”


Gelenbe joined IEEE as a student member because, he says, it gave him easy access to its publications—something that is harder today because a subscription to the IEEE Xplore Digital Library is needed.

An active volunteer, he was the faculty advisor to Imperial College’s IEEE student branch. From 1979 to 1986 he chaired the IEEE Computer Society’s France Section. He also helped found the society’s International Symposium on Modeling, Analysis, and Simulation of Computer and Telecommunication Systems Conference. Today he is a Distinguished Visitor for the society.

He says IEEE gives people in engineering professions the possibility of a career path.

“It is a way of viewing themselves not just in terms of who is employing them but also in terms of their professional standing and their professional activities, which are going to be outside of their strict job requirements. I would certainly recommend membership to anyone with a career in our fields, whether the person is an academic or not.”


