
The intersection of how technology platforms are evolving and how it both affects and aids sustainability is a topic that Forrester’s sustainability team is currently studying. We spoke with Dr. Suku Nair, one of the most respected computer scientists in the country. He currently heads the SMU AT&T Center for Virtualization and is a university distinguished professor at Southern Methodist University’s Department of Electrical and Computer Engineering.

Read this extremely interesting conversation on edge computing, digital twins, and more. And stay tuned for our report later this year that studies this balance in detail. My previous interviews in this series can be found here.

Abhijit: In my work, we see that the key business value of sustainability is how sustainability is synonymous with optimization. Anything you iteratively optimize gives you more efficiency and therefore more performance, pushing you toward sustainability. How do you view this?

Dr. Nair: When I talk about being green or sustainability, I talk about it from an engineering perspective. Being an engineer, this is one thing we always look for: We want to get maximum results with less. So, when we talk about reliability, security, or any design, we are trying to see how we can optimize performance with a better design. As an example, in network connectivity or security, there are always debates on why we need to go through many hoops to make it more efficient. But the idea is that as an organization, today you might have what you need to meet your demand, but tomorrow you may not — as we saw in several examples during the pandemic. Engineering is always built around optimization. That is what excites me when people look at sustainability now, and as engineers, we have been thinking about this all throughout history. So, I think even though the origins of many technologies are for different purposes, the moment you inject optimization into design, operations, and services, there will be some kind of sustainability-related metric that comes out of it as a secondary product.

Abhijit: Where do you see scope for more optimization now across the infrastructure stack — in a data center, for example?

Dr. Nair: When we talk about digitization, the digital economy, and services, everything runs through data centers. Many people do not realize that cloud, too, is nothing but data centers. Today we see the trend of migration of assets and services to the cloud, and sometimes enterprises only think about this as a cost reduction engine. But interestingly, in addition to reducing capex and opex, going to the cloud also helps with other advantages, including the associated benefits of new technologies that are integrated with cloud. Several studies also show that by migrating to public cloud, enterprises can save energy and therefore carbon emissions.

Of course, a public cloud strategy is not right for every enterprise. However, one study stated that if every enterprise moved to the public cloud today, it would be the equivalent of taking 20 million cars off the street. That is not something many people understand — how much efficiency comes from consolidated, optimized infrastructure.

But the story doesn't end there, since the architecture is still evolving. Now we have this on-premises to edge to cloud continuum. We will have services that continue to be hosted on private clouds or servers, but we will also see growth in edge computing infrastructure, which is becoming more and more prevalent alongside the core cloud.

The question is how this is going to affect energy consumption. The advent of edge computing echoes the elastic nature of technology: Initially, you want to move everything to a consolidated, centralized cloud infrastructure; then people realize that, in order to get optimal latency and user experiences, you have to have some of the services, or parts of the services, hosted on the edge side of the network. When you have a lot of data being kept and processed at the edge, you do not have to transmit that much data to the cloud. It turns out that transmission consumes more power than actual processing. So, by keeping the data locally, not only can you provide a better computing experience because of the low latency, but also [you can attain] better privacy, better flexibility, and potentially lower energy consumption.
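Dr. Nair's point about transmission versus processing can be sketched with a toy energy model: processing locally and shipping only a reduced summary can beat shipping all raw data, as long as transmitting a megabyte costs more than processing one. The constants and the 90% data-reduction figure below are illustrative assumptions, not measured values.

```python
# Toy energy model contrasting cloud-only vs. edge-assisted processing.
# All constants are illustrative assumptions, not measured values.

TX_ENERGY_PER_MB = 5.0   # joules to transmit 1 MB over the WAN (assumed)
CPU_ENERGY_PER_MB = 1.0  # joules to process 1 MB of data (assumed)

def cloud_only(raw_mb: float) -> float:
    """Ship all raw data to the cloud, then process it there."""
    return raw_mb * TX_ENERGY_PER_MB + raw_mb * CPU_ENERGY_PER_MB

def edge_assisted(raw_mb: float, reduction: float = 0.9) -> float:
    """Process at the edge first; transmit only the reduced summary."""
    summary_mb = raw_mb * (1 - reduction)
    return raw_mb * CPU_ENERGY_PER_MB + summary_mb * TX_ENERGY_PER_MB

print(cloud_only(100.0))     # 600.0 J for 100 MB
print(edge_assisted(100.0))  # 150.0 J for the same 100 MB
```

Under these assumed constants, keeping the data at the edge cuts the energy for a 100 MB workload by three-quarters; the real ratio depends on the radio, the workload, and how much the edge can compress or filter.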

Another analogy is that edge is almost becoming the coprocessor, if you will. In the early days, you had the main CPU, and if you wanted to accelerate a particular computational function, you used a specialized processor called a coprocessor. So now the cloud is becoming the CPU and the edge is becoming the coprocessor, which can selectively optimize different application services as the data comes in from the customer.

So, this continuum is going to have to be worked out very carefully. There is a very stringent energy balance to strike between the current state and the new applications that will come, and new energy budgets need to be factored in. But concurrently, we need to come up with more optimization technologies, too.

Abhijit: At Forrester, we are looking currently at how various technologies aid and adversely affect sustainability. What are your views on edge specifically? Is there more data on this?

Dr. Nair: Data collection is on the rise. For example, right now one of the key discussions going on is sentient computing vs. energy-efficient computing. Sentient is when you “sensor-ize” the whole world, like smart cities, smart apps, and smart transportation. Internet-of-things (IoT) devices are now strewn all over our ecosystem. The question is, “Are you going to collect all this data and keep sending it back to the cloud for processing?”

When you have IoT devices, having to connect to the cloud means the latency will be much longer — let us say, as a broad example, 100 milliseconds vs. 10 milliseconds. So, if an IoT device has to communicate with the cloud, it will consume more power than if it were to communicate with an edge server: in our broad example, with 10 milliseconds, roughly one-tenth of the energy consumption. Many of these IoT devices are energy constrained — they are charged one time, and there is no replenishment.
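The broad example above can be sketched numerically: if radio energy scales with time-on-air, the 10 ms edge round trip costs roughly one-tenth the energy of the 100 ms cloud round trip, and a one-time-charge device lasts about ten times longer. The radio power draw and energy budget here are assumed figures for illustration only.

```python
# Sketch of the interview's broad example: radio energy proportional
# to round-trip latency. Power draw and budget are assumed figures.

RADIO_POWER_W = 0.2  # assumed active radio power draw, in watts

def round_trip_energy(latency_ms: float) -> float:
    """Energy in joules spent with the radio active for one round trip."""
    return RADIO_POWER_W * (latency_ms / 1000.0)

cloud_j = round_trip_energy(100.0)  # talk to the cloud: ~0.02 J
edge_j = round_trip_energy(10.0)    # talk to a nearby edge server: ~0.002 J

# A one-time-charge device with a fixed energy budget lasts about
# 10x longer talking to the edge under this simple model.
budget_j = 100.0
print(round(budget_j / cloud_j))  # 5000 round trips
print(round(budget_j / edge_j))   # 50000 round trips
```

Real radios also spend energy ramping up and idling, so the true ratio is messier, but the direction of the saving holds.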

Additionally, we must make sure these devices are not on for too long. When you have all these sensors deployed, you should have some kind of scheduling, so that some of the sensors will be on at times and off at others. That kind of very close control and manipulation is only possible when the control is close to the premises.
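The scheduling idea Dr. Nair describes can be sketched as a simple round-robin duty cycle, in which only a few of the deployed sensors are awake in any time slot. This is purely illustrative; a real edge controller would also weigh coverage, data freshness, and remaining battery.

```python
# Minimal round-robin duty-cycle schedule: keep only k_on of n_sensors
# awake in each slot, so every sensor is on roughly k_on/n_sensors of
# the time. Illustrative sketch only.

def duty_cycle_schedule(n_sensors: int, k_on: int, n_slots: int):
    """Yield, per time slot, the set of sensor ids that are awake."""
    for slot in range(n_slots):
        start = (slot * k_on) % n_sensors
        yield {(start + i) % n_sensors for i in range(k_on)}

# Six sensors, two awake per slot: each sensor is on a third of the time.
for slot, awake in enumerate(duty_cycle_schedule(6, 2, 3)):
    print(slot, sorted(awake))
```

With six sensors and two awake per slot, the schedule cycles {0, 1}, {2, 3}, {4, 5}, cutting each sensor's on-time to a third.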

If you really look at the ecosystem end to end, there is a push to “software-ize” the stack. What this means is you can have general-purpose computing platforms, and different applications will be dedicated to using different software systems. So, that will immediately cause consolidation and the reduction of the hardware footprint, which means less energy consumption and less carbon emissions.

Even from the business perspective, new ecosystems are evolving for edge. For example, right now if you look at some of the data centers owned by one particular party — it could be, as an example, AT&T or Equinix — traditionally, data centers are big entities; but at the edge, it could be a micro data center. The initial thought process in the industry was when these hundreds of thousands of micro data centers are going to be deployed all around the world, they are going to be individually owned. But it turns out in the new business model being adopted now, even micro data centers may have multiple providers/vendors that are occupying a single space. So, the ecosystem is evolving, and even the edge units are going to be coinhabited by different parties. The objective here is to minimize resource consumption.

This makes the back-end algorithms more difficult and complex; but still, once it is developed as a one-time effort, it can be put across any number of units, and efficient resource sharing can be enabled.
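One way to picture the kind of back-end resource-sharing algorithm described above is classic first-fit-decreasing bin packing, which consolidates the co-located tenants' workloads onto as few powered-on servers as possible. The demands and capacity here are hypothetical, and real schedulers are far more sophisticated.

```python
# Hedged sketch: first-fit-decreasing packing of tenant workloads onto
# shared micro-data-center servers, so co-located providers consolidate
# onto as few powered-on machines as possible. Numbers are hypothetical.

def first_fit(demands, capacity):
    """Pack demands into servers of `capacity`; return per-server loads."""
    servers = []
    for d in sorted(demands, reverse=True):  # largest demands first
        for i, load in enumerate(servers):
            if load + d <= capacity:
                servers[i] += d  # reuse an already powered-on server
                break
        else:
            servers.append(d)  # power on another server
    return servers

# Four tenants' CPU demands sharing servers of capacity 10:
print(first_fit([6, 5, 4, 3], capacity=10))  # [10, 8] -> two servers
```

Here four tenants fit on two shared servers instead of four dedicated ones, which is the consolidation effect that lowers the hardware footprint and energy use.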

Abhijit: Let us talk about virtualization. Even now after many years of evolution, virtualization is brought forth as one of the initiatives for making the data center more sustainable. How do you see virtualization techniques, tools, and platforms evolving? Is it at a point where it no longer needs to be part of the conversation? Is there more research into that field?

Dr. Nair: If you look at the history of the SMU AT&T Center for Virtualization, we started talking about virtualization when we were doing some projects for AT&T in the telecom virtualization space, and then we thought it would be much more interesting if we broadened the scope. That’s when we said this will be a center for virtualization, and we’ll talk about telecom virtualization, enterprise virtualization, and user experience virtualization.

Let’s talk about the role of virtualization and sustainability.

There is this concept that is becoming mainstream — digital twins. The idea is we should be able to virtualize any hardware artefact for all the characteristics that are salient or observable to us or important to us. Today, we talk about digital twins for entire naval warships.

Here, we will not virtualize every nut and bolt, but we will decide what is the level of granularity we want to see and virtualize it.

To illustrate the massive utility of this, let us take another important example: flight simulators. We do work with companies that produce the flight simulators for the US Air Force.

A student pilot can get trained for many hours in a simulator and learn about many maneuvers and situations. Most of the time, when a commercial pilot gets into the cockpit, they have done more than 80% of their training in a simulator. It may sound a little bit unsettling, but it is very efficient, because simulators can emulate all the possible scenarios and the maneuvers to escape from different danger zones. In a simulator, a pilot can sit in safety and crash multiple times doing the wrong thing, whereas in a real plane, that would be fatal. But what we do not realize is how much carbon emission is avoided by not having the pilot fly a real plane while training. This is an application of a digital twin of the cockpit.

So, now consider other design projects like cars, planes, even artefacts that sit on your table. We can create their digital twins and study all the characteristics and properties before we manufacture them. It saves time, resources, energy, and, of course, carbon.

One other area where we advocate the use of virtualization is education. We can virtualize an entire chemistry or physics lab. A student may not touch the test tube, but she can get very close to the experience and see how things interact. She can do 80% of the work in the virtual lab and the rest in a real lab, and we are already reducing the consumption of chemicals.

Another example I can give you is cybersecurity and supervisory control and data acquisition (SCADA) systems. When we teach these to students, sometimes we have to get very expensive equipment from a company like GE or Siemens and then work with that equipment to see what the security holes are and how we can defend against them. But with simulation or virtualization, we can have those large boxes simulated in their entirety, and then students can experiment with them and learn the security features. So, virtualization has a massive role. Digital twins may sound a little bit too futuristic, but they are already happening at some scale, and they will happen end to end — from design and development to the deployment cycle, from your coffee mug to an airplane.

So, is it ready for mainstream? I would say yes. It is definitely happening, and technologies like AI are helping it even more by automating many processes so that no human intervention is needed.

When we say energy, we are not just talking about nuclear power or fuel; it's human energy, too. We have to look for efficiency at all levels.

Abhijit: Thank you, Dr. Nair, for your time and for sharing insights about the work you lead in this space.


This interview was conducted by Forrester Analyst Abhijit Sunil in association with Researcher Renee Taylor. To learn more about Forrester’s research on technology sustainability, reach out to or