Who hasn't waited several minutes for a report to run, only to discover that a run-time error caused it to fail? We all have. An emerging technology may help relieve these headaches.
One technology gaining attention in the marketplace is in-memory computing. Although the majority of organizations are still in the evaluation phase, early adopters are seeing tremendous speed advantages when processing large volumes of data. Rather than transferring data directly from hard drives (disk), this technology shifts data into Random Access Memory (RAM) or flash memory. In-memory computing is gaining popularity now that the cost of computer memory has fallen, and because the industry needed a way around the processing bottleneck that arises when data is read from disk more slowly than the CPU can consume it.
With in-memory computing, all information is initially loaded into memory. One in-memory computing solution that organizations are implementing is the SAP High-Performance Analytic Appliance (SAP HANA). End users will notice minimal change when processing a single transaction, but reporting speed improves dramatically when processing large volumes of data.
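To make the idea concrete, the sketch below (a hypothetical Python example using made-up sample data, not SAP HANA or any vendor product) contrasts the two access patterns described above: re-reading and re-parsing a file from disk for every report run versus querying a copy of the data that was loaded into memory once.

```python
# Hypothetical illustration of disk-based vs. in-memory querying.
# The file name, row layout and sizes are invented for this sketch.
import csv
import os
import tempfile
import time

# Create a sample "transactions" file on disk: (id, amount) rows.
rows = [(i, i % 100) for i in range(100_000)]
path = os.path.join(tempfile.gettempdir(), "sample_txns.csv")
with open(path, "w", newline="") as f:
    csv.writer(f).writerows(rows)

def total_from_disk():
    # Disk-based pattern: every report run re-reads and re-parses the file.
    with open(path, newline="") as f:
        return sum(int(amount) for _, amount in csv.reader(f))

# In-memory pattern: load the data into RAM once up front...
with open(path, newline="") as f:
    in_memory = [(int(i), int(a)) for i, a in csv.reader(f)]

def total_from_memory():
    # ...then each query touches only the memory-resident copy.
    return sum(amount for _, amount in in_memory)

start = time.perf_counter()
disk_total = total_from_disk()
disk_secs = time.perf_counter() - start

start = time.perf_counter()
mem_total = total_from_memory()
mem_secs = time.perf_counter() - start

# Both paths compute the same answer; only the access path differs.
assert disk_total == mem_total
print(f"disk: {disk_secs:.4f}s  memory: {mem_secs:.4f}s")
os.remove(path)
```

The absolute timings will vary by machine, but the in-memory query avoids file I/O and parsing on every run, which is the same trade-off, at a much larger scale, that in-memory platforms exploit.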
In-memory computing may also help companies with their data analytics needs. Because of the performance impact of running complex queries against production environments, organizations have invested heavily in IT resources, complex data warehouses and reporting cubes to meet their reporting needs. Drawbacks of data warehouses include the limited level of detail typically available and the data integrity controls necessary to reconcile the warehoused data with the source system. Data warehouses also usually run on a different technology platform from the source systems and do not provide access to real-time data. In the short term, in-memory computing may help data warehouse environments increase reporting speed and limit run-time inefficiencies and errors.
The majority of organizations do not anticipate that in-memory computing will totally replace their data warehouse environment. From a long-term perspective, in-memory computing may cause a shift away from complex data models within data warehouses to generating reports and performing analysis directly against the data source without the concern of impacting application performance. However, organizations still need to consider the production data integrity risks if they choose to perform real-time analytics directly against the production environment. With these remaining risks, in-memory computing probably will not make data warehouses obsolete; however, it should enhance users’ productivity due to increased report run-time efficiency in both production processing systems and data warehouse environments.
Contact us with questions regarding in-memory computing and visit our Technology Services page to learn more about software solutions and additional Schneider Downs Technology consulting and network services.
You’ve heard our thoughts… We’d like to hear yours
The Schneider Downs Our Thoughts On blog exists to create a dialogue on issues that are important to organizations and individuals. While we enjoy sharing our ideas and insights, we’re especially interested in what you may have to say. If you have a question or a comment about this article – or any article from the Our Thoughts On blog – we hope you’ll share it with us. After all, a dialogue is an exchange of ideas, and we’d like to hear from you. Email us at [email protected].
Material discussed is meant for informational purposes only, and it is not to be construed as investment, tax, or legal advice. Please note that individual situations can vary. Therefore, this information should be relied upon only when coordinated with individual professional advice.