DWH Optimization using Hadoop


1-day workshop

Cost: 1.250€

Enterprises are looking for fresher data – from daily, to hourly, to real-time – as well as access to data from more sources and for longer periods of time. And they need it faster and cheaper. Meanwhile, traditional approaches to processing data in the data warehouse (ELT) can't keep pace, and data warehouse costs are exploding along with data volumes.

One emerging strategy is data warehouse optimization using Hadoop as an enterprise data hub to augment an existing warehouse infrastructure. By deploying the Hadoop framework to stage and process raw or rarely used data, you can reserve the warehouse for high-value information frequently accessed by business users.
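
To make the offload pattern concrete, below is a minimal PySpark sketch of one typical step: copying a rarely queried table out of the legacy warehouse into partitioned Parquet files on HDFS, where it is cheap to store and remains queryable through Hive, Impala or Spark SQL. This is an illustrative sketch only, not workshop material: the JDBC URL, credentials, table name, partition column and HDFS path are all hypothetical placeholders.

    # Illustrative DWH offload sketch (all names are hypothetical placeholders).
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("dwh-offload").getOrCreate()

    # Pull a cold, rarely queried table from the legacy warehouse over JDBC.
    cold_data = (
        spark.read.format("jdbc")
        .option("url", "jdbc:postgresql://dwh-host:5432/edw")  # hypothetical DWH endpoint
        .option("dbtable", "sales_history")                    # hypothetical table
        .option("user", "etl_user")
        .option("password", "...")                             # supply via a secrets store
        .load()
    )

    # Stage the data on HDFS as partitioned Parquet, still queryable
    # from Hive, Impala or Spark SQL.
    (
        cold_data.write.mode("overwrite")
        .partitionBy("year")                                   # hypothetical partition column
        .parquet("hdfs:///data/staging/sales_history")
    )

    spark.stop()

Once the copy is verified, the source table can be truncated or archived in the warehouse, reserving its capacity for the high-value, frequently accessed data described above.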

Big Industries can guide your organization through offloading your DWH to Hadoop. We engage with you to design an architecture blueprint for augmenting your legacy DWH to increase capacity, maximize productivity and lower cost.

Overview of the workshop

The workshop is delivered over two half-days.
First half day:
* Overview of Hadoop
* Overview of typical Hadoop EDWH offload strategies
* Identify key stakeholder needs and critical success factors for the EDWH
* Gap analysis of the as-is environment (key areas for improvement)

Second half day:
* Presentation of findings
* Structure and prioritise the proposed architecture description, goals and transition plan

Who should attend:
* Business users
* Technology procurement
* BICC (Business Intelligence Competency Centre) team
* Enterprise/Data architects

Rob Gibbon

The engagement is delivered by Rob Gibbon. Rob is a solution architect with hands-on technical knowledge of Big Data system design, build and operation, gained building solutions across varied domains for organizations ranging from start-ups to blue chips.

Big Industries can run this workshop in our training facilities or at your premises.


