Before discussing a topic it is always good to begin with a definition, and what better place to find one than the omniscient Wikipedia (as we have done before). Wikipedia has this to say about in-memory databases.
Source: Wikipedia, The Free Encyclopedia
Or, to paraphrase simply: in-memory databases are more efficient than traditional database management systems because they do not face the same I/O constraints of reading from and writing to disk.
Much like the ubiquitous 'big data' hype, in-memory is in vogue for 2013. However, let's not forget that in-memory databases have been around for 25 years or more, and it was largely an economic constraint (the price of RAM) rather than a technological one that prevented their use from becoming more prevalent. The pace of technological progress, though, has seen the price of RAM drop greatly, with numerous sources identifying that the price of memory has fallen by around 33% per annum for the last two decades, and the same sources expect a similar rate of reduction in the years to come.
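To put that claimed rate of decline in perspective, here is a minimal back-of-the-envelope sketch, assuming a flat 33% drop every year (an idealisation of the figure quoted above, not a market quote):

```python
# Rough sketch: compound effect of a 33% annual price drop over 20 years.
# The starting price is normalised to 1.0 for illustration only.
annual_drop = 0.33
years = 20

price = 1.0  # normalised starting price of one unit of RAM
for _ in range(years):
    price *= (1 - annual_drop)

print(f"Price after {years} years: {price:.5f} of the original")
print(f"That is roughly a {1 / price:,.0f}x reduction")
```

Compounded over two decades, that rate works out to memory being on the order of a few thousand times cheaper per unit than it was at the start, which goes a long way towards explaining why in-memory is suddenly back on the agenda.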
It's hard to deny that the business demand for analytics performance is always there. After all, the sooner the business has the answers, the sooner it can make decisions, which generally leads to greater business value.
However, we still need to come back to the economic argument for storing data in memory. Should all your data be in memory, and more to the point, can you afford for all your data to be in memory?
Yes, the price of memory has dropped significantly and continues to do so, but memory remains roughly 80 times more expensive than disk storage. Add to the equation the rate at which data volumes are growing, and the need to capture, store and analyze those growing volumes, and you are left with two opposing trends that may never, or at least not for the foreseeable future, reach an acceptable price balance point.
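A minimal sketch of those two opposing trends, assuming memory prices fall 33% per year (as above) while data volumes grow at an assumed, purely illustrative 40% per year, shows how much of the unit-price gain is eaten by growth:

```python
# Illustrative sketch of the two opposing trends: memory getting cheaper
# vs. data volumes growing. The 40% annual data growth figure is an
# assumption for illustration, not a sourced statistic.
memory_price_drop = 0.33   # 33% cheaper per year (per the article)
data_growth = 0.40         # assumed annual data volume growth

price_per_gb = 1.0   # normalised
volume = 1.0         # normalised

for year in range(1, 11):
    price_per_gb *= (1 - memory_price_drop)
    volume *= (1 + data_growth)
    total_cost = price_per_gb * volume
    print(f"Year {year:2d}: per-GB price {price_per_gb:.3f}, "
          f"total in-memory cost {total_cost:.3f}")
```

Under these assumptions the per-GB price falls by around 98% over the decade, yet the total cost of keeping everything in memory barely halves, and memory is still paying a large premium over disk for every one of those gigabytes.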
So logically, and (coming from an accounting background) economically, there must be multiple storage options for data within a single analytics ecosystem, of which in-memory will play a significant role. Of course, for data that is used heavily on a day-to-day basis there will be a more convincing business case to store it in memory and deliver the performance the business needs, but for historical, infrequently accessed data, do the numbers really stack up to have it sitting in memory?
For information about how Teradata Intelligent Memory addresses the usage-versus-storage equation, click here.
David Hudson is a Senior Consultant with Teradata Solutions ANZ. He has 10 years' experience in data storage, primarily focused on Enterprise Data Model solutions, including data integration, ETL design and logical data modeling.