Walmart has continued to expand their Big Data practices. Their objective is “to know what every product in the world is, to know who every person in the world is and to have the ability to connect them together in transaction.”
These are rather ambitious goals, stated some time ago by Neil Ashe, CEO of Walmart Global eCommerce. However, looking at their big data strategy, how much they have progressed in the past two years and how they have made big data part of their DNA, that future is not as far off as it sounds.
Currently Walmart processes over 40 petabytes of data per day and runs the second-largest in-memory platform in the world. Billions of rows of data are mined every day to extract valuable information about their 11,000 stores in 27 countries, their 11 e-commerce websites and the needs of their 250 million weekly customers.
They have built applications that rely on over 200 internal and external datasets, integrated into one comprehensive Big Data platform. This means they can layer in factors such as weather, product sales status, pricing, inventory and a lot more, all to anticipate the needs of those 250 million weekly customers.
They use data on, for example, the weather, social events, the economy and competitors to create thousands of variables that can be used to predict sales and demand at current stores, forecast first-year demand at new stores, and even help determine new store locations across the globe.
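To make this concrete, the idea of predicting sales from external variables can be sketched as a simple regression. The sketch below is purely illustrative and assumes nothing about Walmart's actual models or data: the feature names (temperature, local-event flag) and all numbers are invented, and ordinary least squares stands in for whatever far more sophisticated methods a platform at this scale would use.

```python
# Illustrative sketch: predicting weekly sales from external signals.
# Features and figures are made up; OLS is a stand-in for real models.

def fit_ols(X, y):
    """Ordinary least squares via the normal equations (X^T X) b = X^T y."""
    n = len(X[0])
    # Build X^T X and X^T y.
    xtx = [[sum(row[i] * row[j] for row in X) for j in range(n)] for i in range(n)]
    xty = [sum(X[r][i] * y[r] for r in range(len(X))) for i in range(n)]
    # Solve with Gaussian elimination and partial pivoting.
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(xtx[r][col]))
        xtx[col], xtx[pivot] = xtx[pivot], xtx[col]
        xty[col], xty[pivot] = xty[pivot], xty[col]
        for r in range(col + 1, n):
            f = xtx[r][col] / xtx[col][col]
            for c in range(col, n):
                xtx[r][c] -= f * xtx[col][c]
            xty[r] -= f * xty[col]
    beta = [0.0] * n
    for i in range(n - 1, -1, -1):
        beta[i] = (xty[i] - sum(xtx[i][j] * beta[j]
                                for j in range(i + 1, n))) / xtx[i][i]
    return beta

# Toy weekly observations: [intercept, temperature (C), local_event_flag].
X = [[1, 20, 0], [1, 25, 1], [1, 30, 0], [1, 15, 1], [1, 22, 0]]
y = [200, 320, 260, 250, 210]  # units sold that week

beta = fit_ols(X, y)
# Forecast demand for a warm week with a local event.
forecast = sum(b * f for b, f in zip(beta, [1, 28, 1]))
print(round(forecast))
```

A real system would of course span thousands of such variables and far richer models, but the principle is the same: external signals become features, and features drive demand forecasts per store and per product.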
Walmart is a true giant in the field of Big Data analytics, and they are likely to continue expanding their practices in the future. The video above shows the different Big Data capabilities at Walmart, and it is truly impressive.