GM Is Insourcing Its Data Centers: What’s Your Plan to Leverage High-Value Data?
Recently, General Motors announced that they're building a new $258 million enterprise data center in Warren, Michigan. With it, they are going from 23 outsourced data centers around the world to two data centers that are, in essence, in-sourced. Bringing the data centers back to Michigan, and back under GM's full control, is a complete reversal of their previous strategy.
So why are they doing this? To reduce costs? Remember that they initially outsourced their data centers to reduce costs. However, with this new move, GM says they can reduce costs by an additional 40%. In other words, they initially outsourced to reduce costs, but now they're in-sourcing to reduce even more costs, while consolidating all their data under their own control. On the surface, this almost doesn't make sense.
Actually, it makes perfect sense. To better understand why this is a strategic move for GM, we have to look at our three change accelerators: processing power, storage, and bandwidth. The exponential advances taking place in all three areas have reached unprecedented levels. You've likely heard the story about what happens when you start with a penny and double it every day. Tomorrow you'd have two cents; the next day, four; the next, eight; and so on. By the end of the week, you would have a whopping sixty-four cents. By the end of week two, your cache of cash would have grown to $81.92. Not too exciting. But by day twenty-eight, just two weeks later, your pile of pennies would exceed $1 million; on day thirty it would be over $5 million. If this happened to be a thirty-one-day month, you would end the month with more than $10 million.
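If you want to sanity-check that arithmetic, a few lines of Python reproduce the whole progression (one cent on day one, doubling every day thereafter):

```python
# One penny on day 1, doubling every day: pennies on day n = 2 ** (n - 1).
for day in (7, 14, 28, 30, 31):
    cents = 2 ** (day - 1)
    print(f"Day {day:2d}: ${cents / 100:,.2f}")

# Output:
# Day  7: $0.64
# Day 14: $81.92
# Day 28: $1,342,177.28
# Day 30: $5,368,709.12
# Day 31: $10,737,418.24
```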
If doubling a penny and suddenly reaching $10 million seems dramatic, imagine this: what if, the next month, you started with that $10 million and kept doubling? That's the change level we're approaching with the three accelerators. Consider this: what was considered the world's fastest supercomputer two years ago was recently disassembled because it was obsolete. And of course, as the power of those three change accelerators continues to increase dramatically and exponentially, their price continues to drop. So we can do much, much more with much, much less.
But that’s not the only thing driving GM’s decision to in-source their data. The nature of big data and high speed data analytics is changing too. Not only are companies creating more data than ever before, but the data they are creating is much more valuable. Here’s an example.
The latest plug-in electric vehicles produce 25 gigabytes of data an hour. Some of that data is sent to the driver's smartphone, keeping them informed about the car's battery life, tire wear, vehicle performance, the location of the nearest plug-in stations, and more. Thanks to all this data, the driver as well as the service center can do predictive analysis of the car, which means being able to predict car troubles before they occur. Now the driver can fix a problem before it manifests, thus preventing the car from breaking down unexpectedly.
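To get a feel for the scale involved, here is a rough back-of-the-envelope sketch in Python. Only the 25 GB/hour figure comes from the paragraph above; the daily driving time and fleet size are illustrative assumptions, not numbers from the article:

```python
# Rough scale of connected-car data. Only the 25 GB/hour figure comes from
# the article; driving time and fleet size are illustrative assumptions.
GB_PER_HOUR = 25
HOURS_DRIVEN_PER_DAY = 1.5   # assumed average daily driving time
FLEET_SIZE = 100_000         # assumed number of connected cars

per_car_per_day_gb = GB_PER_HOUR * HOURS_DRIVEN_PER_DAY
fleet_per_day_tb = per_car_per_day_gb * FLEET_SIZE / 1_000

print(f"Per car, per day:     {per_car_per_day_gb:.1f} GB")
print(f"Whole fleet, per day: {fleet_per_day_tb:,.0f} TB")
# Per car, per day:     37.5 GB
# Whole fleet, per day: 3,750 TB
```

Even under these modest assumptions, a mid-sized connected fleet generates petabytes per year, which is why where that data lives becomes a strategic question.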
The data the car produces also goes to the carmaker so they can track customer satisfaction and vehicle performance, enabling them to make better vehicles in the future. In fact, the carmaker can learn what's happening with their cars in real time, which enhances their ability to continuously innovate. In this sense, data increasingly becomes the company's crown jewels. Because an amazing amount of data is being generated, and because that data is far more strategic, companies can extract active intelligence from it to make better decisions in real time. No wonder GM wants all their data in-house.
Now, this doesn't mean that every company should have their own data center or copy what GM is doing. Many companies use software as a service (SaaS) to lower their software and hardware costs, and hardware as a service (HaaS) for data storage. Those are valid options for many organizations. So many services can now be cloud-enabled and virtualized that we are seeing everything as a service (XaaS) rapidly emerge, for example, collaboration as a service (CaaS).
The key is to do what’s best for your company today, based on the hard trends that are shaping the future and regardless of what may have worked in the past. Therefore, you need to ask yourself:
- What kind of business are we?
- What industries are converging to create new opportunities?
- What is the size and reach of our business?
- What are the ideal short-, mid-, and long-range goals for our organization?
- How much agility do we need to stay ahead of the competition?
- How much data are we producing now, and how much do we plan to produce in the near future?
- What is the value of the data we have and are now capable of collecting?
- What kind of competitive advantage can our data help us create?
Not every company generates as much data as GM. And not every company has to track hundreds of thousands of parts and supplies. But every company creates data and will create much more in the future, and that data is increasingly becoming the key to your organization's growth. Therefore, it's imperative that you think through your data plan so you can leverage your data to solve problems faster, make smarter decisions, and reach your goals sooner.
Remember, too, that because the three change accelerators of processing power, storage, and bandwidth are still growing and will continue to do so, you need to re-evaluate your position often. Even though GM is bringing their data centers back home, they'll have to revisit their current strategy in just a few years.
Times are changing fast, and the rate of change will only increase as time goes on. So what works today may not work two years from now. Therefore, whatever your company does or decides is best for today, re-evaluate that strategy often. Look at your data and where your competitive advantage is coming from so you can take advantage of the newest technologies and not be trapped in the past.
If you keep doing what you’ve always done in the midst of rapid change, you’ll lose your competitive advantage. You either change with the times, or you get left behind. Which option makes the most sense for your company?
Certain water beetles can escape from frogs after being consumed.
- A Japanese scientist shows that some beetles can wiggle out of frogs' butts after being eaten whole.
- The research suggests the beetle can get out in as little as 7 minutes.
- Most of the beetles swallowed in the experiment survived with no complications after being excreted.
In what is perhaps one of the weirdest experiments ever, straight from the category of "why did anyone need to know this?", scientists have shown that the Regimbartia attenuata beetle can climb out of a frog's butt after being eaten.
The research was carried out by Kobe University ecologist Shinji Sugiura. His team found that the majority of beetles swallowed by the black-spotted pond frogs (Pelophylax nigromaculatus) used in their experiment managed to escape within about six hours and were perfectly fine.
"Here, I report active escape of the aquatic beetle R. attenuata from the vents of five frog species via the digestive tract," writes Sugiura in a new paper, adding "although adult beetles were easily eaten by frogs, 90 percent of swallowed beetles were excreted within six hours after being eaten and, surprisingly, were still alive."
One bug even got out in as little as 7 minutes.
Sugiura also tried putting wax on the legs of some of the beetles, preventing them from moving. Those beetles were unable to make it out alive, taking from 38 to 150 hours to be digested.
Naturally, as anyone would upon encountering such a story, you're wondering: where's the video? Thankfully, the scientists recorded the proceedings.
The Regimbartia attenuata beetle can be found in the tropics, especially as a pest in fish hatcheries. It's not the only creature that can survive being swallowed. A recent study showed that snake eels are able to burrow out of the stomachs of fish using their sharp tails, only to become stuck, die, and be mummified in the gut cavity. Usually, such travelers through the digestive tract have particular adaptations that let them withstand extreme pH and a lack of oxygen. The beetle, however, seems to escape actively; scientists are calling its ability the first documented "active prey escape." The researchers think the beetle's trick lies in inducing the frog to open a so-called "vent" controlled by the sphincter muscle.
"Individuals were always excreted head first from the frog vent, suggesting that R. attenuata stimulates the hind gut, urging the frog to defecate," explains Sugiura.
For more information, check out the study published in Current Biology.
The design of a classic video game yields insights on how to address global poverty.
Poverty can be a self-sustaining cycle that might require an external influence to break it. A new paper published in Nature Sustainability, written by professor Andrew Bell of Boston University, suggests that we could improve global anti-poverty and economic development systems by turning to an idea from a video game about a kart-racing Italian plumber.
A primer on Mario Kart
For those who have not played it, Mario Kart is a racing game starring Super Mario and other characters from the video game franchise that bears his name. Players race around tracks collecting power-ups that can either directly help them, such as mushrooms that speed up their karts, or slow down other players, such as heat-seeking turtle shells that momentarily crash other karts.
The game is well known for having a mechanism known as "rubber-banding." Racers in the front of the pack get wimpy power-ups, like banana peels to slip up other karts, while those toward the back get stronger ones, like golden mushrooms that provide extra long speed boosts. The effect of this is that those in the back are pushed towards the center, and those in front don't get any boosts that would make catching them impossible.
If you're in last, you might get the help you need to make a last-minute break for the lead. If you're in first, you have to be on the lookout for these breakouts (and the ever-dreaded blue shells). The game remains competitive and fun.
Rubber-banding: A moral and economic lesson from Mario Kart
In the real world, we see rubber-banding used all the time. Welfare systems tend to provide more aid to those who need it than to those who do not. Many of them are financed by progressive taxation, which falls more heavily on the well-off than on the down-and-out. Some research suggests that these systems do work, as countries with lower levels of income inequality have higher levels of social mobility. A simplified sketch of how progressive taxation "rubber-bands" follows below.
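Here is a deliberately simplified marginal-bracket calculation in Python; the brackets and rates are invented for illustration and do not correspond to any real tax code:

```python
# Toy progressive tax: higher marginal rates on higher income brackets.
# Brackets and rates are invented, not taken from any real tax code.
BRACKETS = [(10_000, 0.00), (40_000, 0.20), (float("inf"), 0.40)]

def tax_owed(income: float) -> float:
    """Total tax using marginal brackets: each rate applies only to the
    slice of income that falls inside its bracket."""
    owed, lower = 0.0, 0.0
    for upper, rate in BRACKETS:
        if income > lower:
            owed += (min(income, upper) - lower) * rate
        lower = upper
    return owed

for income in (15_000, 60_000, 200_000):
    print(f"Income ${income:,}: tax ${tax_owed(income):,.0f} "
          f"({tax_owed(income) / income:.0%} effective)")
```

The effective rate climbs with income (roughly 7%, 23%, and 35% in this toy example), which is the fiscal analogue of giving weaker power-ups to the racers out in front.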
It is a little more difficult to use rubber-banding in real life than in a video game, of course. While in the game it is easy to decide who is doing well and who is not, things can be a little more muddled in reality. Furthermore, while those in a racing game are necessarily antagonistic to each other, real systems often strive to improve conditions for everybody or to reach common goals.
As Bell points out, rubber-banding can also be used to encourage sustainable growth programs, beyond welfare, that help the poor. He points to projects such as irrigation systems in Pakistan and Payments for Ecosystem Services (PES) schemes in Malawi, which use positive feedback loops to both provide aid to the poor and promote stable systems that benefit everyone.
Rubber-banding feedback loops in different systems. Mario Kart (a), irrigation systems in Pakistan (b), and PES operations in Malawi (c) are shown. Links between one better-off (blue) and one worse-off (red) individual are highlighted. Credit: Andrew Bell / Nature Sustainability
In the Malawi case, farmers were paid to practice conservation agriculture to reduce the amount of sediment from their farms flowing into a river. This immediately benefits hydroelectric producers and their customers, but it also provides real benefits to the farmers in the long run, as their soil doesn't erode. By giving farmers an incentive to conserve the soil, a virtuous cycle of conservation, soil improvement, and improved yields can begin.
While this loop differs from the rubber-banding in Mario Kart, the game's approach can help illustrate the benefits of rubber-banding in achieving a more equitable world.
The task now, as Bell says in his paper, is to look at problems that exist and find out "what the golden mushroom might be."
Satellite imagery can help better predict volcanic eruptions by monitoring changes in surface temperature near volcanoes.
- A recent study used data collected by NASA satellites to conduct a statistical analysis of surface temperatures near volcanoes that erupted from 2002 to 2019.
- The results showed that surface temperatures near volcanoes gradually increased in the months and years prior to eruptions.
- The method was able to detect potential eruptions that were not anticipated by other volcano monitoring methods, such as eruptions in Japan in 2014 and Chile in 2015.
How can modern technology help warn us of impending volcanic eruptions?
One promising answer may lie in satellite imagery. In a recent study published in Nature Geoscience, researchers used infrared data collected by NASA satellites to study the conditions near volcanoes in the months and years before they erupted.
The results revealed a pattern: Prior to eruptions, an unusually large amount of heat had been escaping through soil near volcanoes. This diffusion of subterranean heat — which is a byproduct of "large-scale thermal unrest" — could potentially represent a warning sign of future eruptions.
Conceptual model of large-scale thermal unrest. Credit: Girona et al.
For the study, the researchers conducted a statistical analysis of changes in surface temperature near volcanoes, using data collected over 16.5 years by NASA's Terra and Aqua satellites. The results showed that eruptions tended to occur around the time when surface temperatures near the volcanoes peaked.
Eruptions were preceded by "subtle but significant long-term (years), large-scale (tens of square kilometres) increases in their radiant heat flux (up to ~1 °C in median radiant temperature)," the researchers wrote. After eruptions, surface temperatures reliably decreased, though the cool-down period took longer for bigger eruptions.
"Volcanoes can experience thermal unrest for several years before eruption," the researchers wrote. "This thermal unrest is dominated by a large-scale phenomenon operating over extensive areas of volcanic edifices, can be an early indicator of volcanic reactivation, can increase prior to different types of eruption and can be tracked through a statistical analysis of little-processed (that is, radiance or radiant temperature) satellite-based remote sensing data with high temporal resolution."
Temporal variations of target volcanoes. Credit: Girona et al.
Although using satellites to monitor thermal unrest wouldn't enable scientists to make hyper-specific eruption predictions (like predicting the exact day), it could significantly improve prediction efforts. Seismologists and volcanologists currently use a range of techniques to forecast eruptions, including monitoring for gas emissions, ground deformation, and changes to nearby water channels, to name a few.
Still, none of these techniques has proven completely reliable, both because of scientific limitations and because of practical barriers (e.g., funding) that stand in the way of large-scale monitoring. In 2014, for example, Japan's Mount Ontake suddenly erupted, killing 63 people. It was the nation's deadliest eruption in nearly a century.
In the study, the researchers found that surface temperatures near Mount Ontake had been increasing in the two years prior to the eruption. To date, no other monitoring method has detected "well-defined" warning signs for the 2014 disaster, the researchers noted.
The researchers hope satellite-based infrared monitoring techniques, combined with existing methods, can improve prediction efforts for volcanic eruptions, which have killed about 2,000 people since 2000.
"Our findings can open new horizons to better constrain magma–hydrothermal interaction processes, especially when integrated with other datasets, allowing us to explore the thermal budget of volcanoes and anticipate eruptions that are very difficult to forecast through other geophysical/geochemical methods."