Job Listing Information
- 06-Jan-2022 until filled (UTC)
- Trenton, NJ, USA
- Full Time
- Long-Term Contract
The Contractor will work remotely, with the expectation of coming into the office to work on-site a few times per month, or more frequently if policies change. As such, the Contractor must live locally.
This Senior Data Modeler (SDM) position involves designing and implementing modern data modeling solutions, including relational, dimensional, snowflake, star, and data lake schemas, that meet Judiciary needs using conceptual, logical, and physical data models. The SDM should be capable of modeling for various databases such as AWS Redshift, S3, DynamoDB, DB2, and Postgres.
The Senior Data Modeler (SDM) Specialist will be responsible for the design, development, and implementation of data warehouse and analytical database structures. The SDM will work with business groups and application teams during pre- and post-assessment periods, and reports to a Supervisor/Team Lead. The SDM should be able to write complex SQL queries and write and execute stored procedures for data analysis. The SDM should understand the needs of the business and BI teams in order to build proper data warehouses and data marts, and should be capable of developing structured and unstructured data models for BI and analytics.
- Understand and translate business needs into data models supporting long-term solutions for different Judiciary systems.
- Work with the Application Development teams (ex: Pega Applications) to implement data strategies, build data flows and develop conceptual data models.
- Create logical and physical data models using best practices to ensure high data quality and reduced redundancy.
- Optimize and update logical and physical data models to support the Data Initiative project through its different phases of implementation.
- Develop best practices for standard naming conventions and coding practices to ensure consistency of data models.
- Recommend opportunities for reuse of data models in new environments.
- Perform reverse engineering of physical data models from databases like DB2 and SQL scripts.
- Evaluate data models and physical databases for variances and discrepancies across the different source systems in Judiciary.
- Analyze data-related system integration challenges, propose appropriate solutions, and develop models according to Judiciary standards.
- Work with IT teams, including system analysts, engineers, programmers, and BI staff, on project limitations and capabilities, performance requirements, and interfaces.
- Review modifications to existing software to improve efficiency and performance.
- Candidate should have 8+ years of overall data modeling experience.
- Minimum 4 years of overall experience, with 3 years in Big Data technologies
- Strong work experience in data modeling and with data modeling tools.
- Exposure to Snowflake's Cloud Data Warehouse is a plus.
- Should be well versed in relational data, dimensional data, unstructured data, and master data.
- Should be current with recent data management trends, such as data lake modeling, alongside traditional modeling practices like snowflake and star schemas.
- Should know the latest modeling techniques adapted for data warehousing and analytical structures.
- Able to build logical data models and translate them into physical database designs.
- Must be able to compare actual project procedures against the specified standards and procedures.
- Should be able to demonstrate expertise with the most recent and relevant technologies in data modeling implementation.
- Must have excellent communication skills and the ability to collaborate with internal and external groups, including vendors.
- Must have the ability to work independently and with minimal supervision.
- 8+ years of experience in data modeling for databases such as DB2, AWS Redshift, DynamoDB, Postgres, SQL Server, and Pega applications.
- Data modeling software (6 years)
- AWS and other cloud systems experience is a must.
- Strong SQL and data profiling skills in various relational databases.
- Analyzing source databases, source data, and source database referential integrity, and profiling source data (8 years)
- Database tuning techniques (minimum 4 years)
- Experience in building Data Lakes for both structured and unstructured data (2 years)
Required / Desired Skills
Experience in Data Modeling for databases such as DB2, AWS Redshift, DynamoDB, Postgres, SQL Server - Required 8 Years
Data modeling software - Required 6 Years
AWS and Other Cloud systems - Required 3 Years
Strong SQL and data profiling skills in various relational databases - Required 8 Years
Analyzing source databases, source data, source database referential integrity and profiling source data - Required 8 Years
Database tuning techniques - Nice to have 4 Years
Experience in building Data Lakes for both structured and unstructured data - Required 2 Years
4-year college degree or equivalent technical study - Required 4 Years