Big Data Solution Architect Expert

Overview of Job Function: 

Develop the overall data vision and architecture strategy to ensure reusability, integration, and optimization of the data warehouse solution (architecture, logical/physical modeling, and tool selection). Identify technology gaps and suggest additional/alternate solution stack components; design various components of the architecture model and lead technical resource team to achieve project goals.

Principal Duties and Essential Responsibilities:  

  • Develop a data strategy and an actionable roadmap that provide core data services, advanced analytics capabilities, and access to all required corporate data for constituencies across the enterprise, enabling cross-functional collaboration and business insight.
  • Conduct required strategic assessment of existing or planned enterprise data ecosystem environment to identify opportunities for improvement and innovation.
  • Consult with data consumers across the organization; identify all current and emerging data sources, data governance practices, the DW, BI, and ETL tools and systems in use, and the data and analyses that live within departments distributed across the organization.
  • Develop consolidated requirements that describe needs within and across business functions and distinct lines of business.
  • Provide a gap analysis based on the capabilities of the current environment.
  • Collaborate with internal constituents to determine the desired end state, existing technology stack, in-house talent, strategic goals, performance measures, and specific requirements.
  • Identify critical issues, potential areas of concern, and areas for innovation, and incorporate them into recommendations.
  • Recommend a future-state technical footprint, with an assessment of gaps and opportunities.
  • Detail the infrastructure design and outline the technologies and processes needed to extend the current landscape to its future state.
  • Review existing practices and determine what, if anything, is needed to establish a data governance program.

Minimum Requirements:

  • Proven track record of at least eight years designing and leading successful, large, complex data warehouse solutions integrated with Hadoop environments and pulling data from multiple sources
  • Five years’ experience in project leadership, including responsibilities for people, processes, and technologies
  • Five years of strategic consulting or similar experience
  • Hands-on experience with Big Data technologies (Hadoop, Hive, Pig, MapReduce, NoSQL)
  • Demonstrated expert-level experience in the Kimball dimensional data modeling methodology; ability to create conceptual, logical, and physical models for operational and analytical systems
  • Extensive experience with and knowledge of multiple Data Warehouse/Data Mart architectures including Hadoop, various MPP and SMP database systems
  • Expertise in multiple traditional and Hadoop ETL development tools
  • Solid communication, presentation, and analytical skills

Please contact us at jobs@casertaconcepts.com for more information or to apply.