Hadoop Workshops

Hadoop deployment planning 2-day workshop

An increasing number of organisations are seeking ways to profit from Big Data, and Hadoop is rapidly proving itself a versatile and capable enabler for unlocking that value. Organisations adopting Hadoop as a key component of their enterprise architecture need to evaluate which Hadoop distribution to implement, alongside considerations such as data security, performance, scalability and manageability.

Hadoop major components & features
– Components overview
– Application resource management
– Application resource governance

Workload considerations

Technical & feature considerations among the major vendors – Cloudera, Hortonworks, MapR
– Data Protection & Security
– Management tools
– Deployment
– Upgrade
– Access controls
– Encryption
– Accounting & governance
– Disaster recovery (DR) features

Review of enterprise system vendor offerings – IBM, Oracle, EMC

Service management considerations
– Release management
– Scalability concerns & capacity planning
– Security best practices
– Archival and backup
– Monitoring, alerting
– Performance management, elasticity
– Reliability, durability

Hardware budgeting, selection & configuration
– Server architectures and options
– Switching and other network elements
– Racking considerations

SLA planning

Business continuity planning

Staff training & readiness planning

Big Industries can run this workshop in our training facilities or at your premises.

DWH Optimization with Hadoop 1-day workshop

Enterprises are looking for fresher data – from daily to hourly to real-time – as well as access to data from more sources and for longer periods of time. And they need it faster and cheaper. Meanwhile, traditional approaches to processing data in the data warehouse (ELT) can’t keep pace, and data warehouse costs are exploding along with data volumes.

One emerging strategy is data warehouse optimization using Hadoop as an enterprise data hub to augment an existing warehouse infrastructure. By deploying the Hadoop framework to stage and process raw or rarely used data, you can reserve the warehouse for high-value information frequently accessed by business users.
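To make the staging pattern concrete, the sketch below shows one way the offload step might look in PySpark. It is a minimal illustration under stated assumptions, not a prescribed implementation: the warehouse JDBC URL, the etl credentials and the archive_orders table are hypothetical placeholders, and tools such as Sqoop or Hive could perform the same transfer.

    # Minimal PySpark sketch of a DWH offload step (all names hypothetical).
    # Assumes pyspark is installed and a PostgreSQL JDBC driver is on the classpath.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("dwh-offload-sketch").getOrCreate()

    # Pull a rarely used, "cold" table out of the warehouse over JDBC.
    archive = (spark.read.format("jdbc")
               .option("url", "jdbc:postgresql://dwh.example.com:5432/sales")
               .option("dbtable", "archive_orders")
               .option("user", "etl")
               .option("password", "changeme")  # use a credential store in practice
               .load())

    # Stage it on HDFS as Parquet, partitioned for cheap pruning, so the
    # warehouse keeps only high-value, frequently accessed data.
    (archive.write
     .mode("overwrite")
     .partitionBy("order_year")  # assumes an order_year column exists
     .parquet("hdfs:///data/staging/archive_orders"))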

Big Industries can guide your organisation through offloading your DWH to Hadoop. We engage with you to design an architecture blueprint for augmenting your legacy DWH to increase capacity, maximise productivity and lower cost.

Overview

The workshop runs over two half-days.
First half day:
* Hadoop overview
* Overview of typical Hadoop EDWH offload strategies
* Identify key stakeholder needs and critical success factors for EDWH
* Gap analysis of as-is environment (key areas for improvement)

Second half day:
* Presentation of findings
* Structure and prioritise the proposed architectural description, goals and transition plan

Who should attend:
* Business users
* Technology procurement
* BICC (Business Intelligence Competence Centre) team
* Enterprise/Data architects

The engagements are delivered by Rob Gibbon, an experienced Solution Architect with extensive applied knowledge of Hadoop and Hadoop ecosystem technologies.

Big Industries can run this workshop in our training facilities or at your premises.

