Off to the Races: Big Data and the Need for Speed
Time is money, especially if you’re a Lucas Oil Off Road racer: First place could mean $2,100. But standing in the winner’s circle doesn’t come easy—it takes guts and grit, backed by sophisticated timing systems able to sort out even the closest race. When one hundredth of a second means the difference between victory and defeat, mistakes can’t happen; timing data must be both precise and delivered on demand.
Enterprises share a similar concern when it comes to big data. Accurate and timely analysis gives companies an edge over their competition, helping them understand when it’s time to stomp on the gas and when they need to take it slow. Bottom line? You can’t afford to lose this race.
Race Day Analytics
It’s easy to write off data analytics tools as hype; the costs of cloud computing, as-a-service deployments and bring-your-own-device (BYOD) adoption make it tempting to avoid a clear-cut data strategy. Many industries—notably healthcare—actively resist the pull of big data.
The problem? Using legacy systems is like timing a Formula 1 race with a stopwatch and the naked eye; there's a better way, and it's paying dividends. Inis Motorsport, for example, supplies the live timing technology used in all Lucas Oil Off Road races and is able to push results in real time to PCs, mobile devices and Web browsers.
The system uses a series of transponders that track each car as it crosses the finish line, instantly providing lap, best and gap times. For enterprises? Sophisticated data analytics tools now on the market can churn through vast amounts of structured and unstructured information, uncovering patterns and trends as they go. And according to a recent Forbes article, cutting-edge solutions work in real time.
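To make the idea concrete, here is a minimal sketch of how lap, best and gap times can be derived from transponder crossing timestamps. The function names and sample data are illustrative assumptions, not Inis Motorsport's actual system:

```python
# Hypothetical sketch: deriving lap, best-lap and gap times from
# transponder crossings (seconds since the session started).
# Names and data are illustrative only.

def lap_times(crossings):
    """A lap time is the delta between successive finish-line crossings."""
    return [later - earlier for earlier, later in zip(crossings, crossings[1:])]

def best_lap(crossings):
    """The fastest single lap in the session."""
    return min(lap_times(crossings))

def gap_to_leader(leader_crossing, car_crossing):
    """Gap: how long after the leader a given car crossed the line."""
    return car_crossing - leader_crossing

# One car crosses the line at 0.0s, 31.2s, 62.1s and 93.5s.
crossings = [0.0, 31.2, 62.1, 93.5]
print(lap_times(crossings))
print(best_lap(crossings))
print(gap_to_leader(93.5, 94.1))
```

In a live system the same arithmetic would run as each crossing arrives, which is why results can be posted the instant a car clears the line.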
V for Victory?
Beyond intelligent systems, companies need skill. Having data to mine isn't enough on its own; businesses need to use the right data sets, ask the right questions and be prepared to act on results immediately. But the move away from gut feelings and corporate experience toward hard data can be daunting. As described by IBM, however, it's possible to evaluate the strength of your data using what Big Blue calls the "4Vs": volume, velocity, variety and veracity.
Sound complicated? Consider the example of a race track with hundreds of cars zooming past the finish line. This is the enterprise server; the cars are bits of collected data. To offer value, there must be a certain volume of cars on the track: One or two don’t make a race. It must also be possible to analyze data with a certain velocity—as each car crosses the line, times must be posted instantly.
In addition, data needs veracity, which refers to the logical consistency of both data sets and results. Sets must exhibit at least some commonality; if the race is made up of two funny cars, three monster trucks and a vintage sedan, the results won't be usable. Finally, results reporting must be reliable; if widely different lap times are reported for similar vehicles, something isn't right.
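That last veracity check can be sketched in a few lines: flag any lap time that strays too far from the median for comparable vehicles. The function name, the 10% tolerance and the sample data are all assumptions for illustration:

```python
# Illustrative veracity check: flag lap times that deviate widely from
# the median for similar vehicles. Threshold and data are assumptions.
import statistics

def suspect_laps(times, tolerance=0.10):
    """Return lap times more than `tolerance` (default 10%) from the median."""
    median = statistics.median(times)
    return [t for t in times if abs(t - median) > tolerance * median]

laps = [31.2, 30.9, 31.4, 45.8, 31.1]  # one reading looks wrong
print(suspect_laps(laps))  # → [45.8]
```

The median is used instead of the mean so a single bad reading can't drag the baseline toward itself; the same idea scales to any metric where similar records should produce similar values.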
The Final Data Lap
Variety is where many enterprises get bogged down. Instead of a closed track, the flow of big data is like continually adding new cars to the course until the ground is fairly littered with wreckage.
Consumer data, internal data, Web analytics data: all are part of a larger whole, but together they can make the total amount of information available to a company seem impossibly huge. Your best bet? Start small. Find a Web host that supports popular analysis tools. When it's time to go bigger, consider a SaaS deployment with a narrow focus, followed by a custom-built or in-house solution.
Don’t get left behind; real-time analytics can help make sure you never miss the checkered flag.