As someone who writes predictive analytics software, I can say that you are dead on. Most of analytics is a process of doing one of two things:
1) Extrapolation of data where samples are lacking.
2) Taking big data and making it little data so that you can actually comprehend it.
My goal in practically every algorithm/tool I write is to take several billion records and condense them into something that can be put in a spreadsheet, put on a chart, or rendered on a map. Analysts roll their eyes every time some big data guru releases another map with 7 trillion points on it. "Oh, so you took 3 weeks rendering a map that looks like yet another population map, when you could have rendered this instantaneously with a choropleth?"
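The second point, making big data into little data, is usually just aggregation. A minimal sketch of the idea, using made-up region records (the names and structure here are hypothetical, not from any particular tool):

```python
# Hypothetical sketch: collapse per-record "big data" into a small
# per-region summary that a choropleth, chart, or spreadsheet can consume.
# In practice this would run over billions of rows in a warehouse or a
# map-reduce job, but the aggregation logic is the same.
from collections import Counter

def summarize_by_region(records):
    """Collapse individual point records into per-region counts."""
    # A choropleth needs one value per region, not one dot per record.
    return dict(Counter(r["region"] for r in records))

records = [
    {"region": "CA"},
    {"region": "CA"},
    {"region": "NY"},
]
print(summarize_by_region(records))  # {'CA': 2, 'NY': 1}
```

The point is that the output is a handful of rows regardless of how many input records there were, which is exactly what makes it comprehensible.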