ETL Developer, Resume Profile, Columbus OH

Reference Id: 83874     Posted on Sunday 20th September 2015      ETL Developer





 

Experience Summary 

·         10+ Years of strong IT experience in Databases, Data Warehousing & Decision Support Systems.

·         Worked with all the phases of System Development and Project Management Life Cycles from analysis of business requirements, designing the implementation strategy, Data Analysis, relational database design and architecture, business/technical liaising, workflow and quality assurance.

·         Experience in Data Warehousing, OLAP, ETL and Business Intelligence

·         Knowledge of Hadoop, Pig, Hive and HBase

·         Extensively worked with Relational (ER), Normalized, ODS/Staging, Star and Snowflake Dimensional Data Models.

·         Worked with EIM (Enterprise Information Management), ECM (Enterprise Content Management) and BI (Business Intelligence) solutions, with experience working with databases (Oracle and Access), integration tools (Informatica and PL/SQL programming) and Business Intelligence solutions (OBIEE, AWM).

·         Wrote Database ETL/API programs using Oracle (9i/10g/11g) SQL & PLSQL (Procedures, functions, triggers, Packages, Views and Materialized logs & Views, Collections), CDC, and utilities (Data pump, Export/Import, SQL Loader, External tables).

·         Expertise in defining and implementing ETL Data load strategies including Type 1/2/3 Dimensions and facts (SCD), ETL Exception handling and Restart recovery logic for relational and flat files.

·         Worked with Informatica transformations such as Joiner, Lookup (Static, Dynamic, Persistent), Aggregator, Rank, Update Strategy, Transaction Control, Source Qualifier, Sorter, Normalizer, Filter, Stored Procedure, Expression and Router.

·         Experience in SQL tuning by analyzing data access paths including Explain Plan and using Database hints, indexes and partitions

·         Good knowledge of Informatica Data Quality, B2B, Master Data Management and Metadata Management

·         Practiced Informatica best practices including session partitioning and caching techniques.

·         Experience with Software Development Life Cycle (SDLC)

·         Wrote the low level technical design documents; developed, tested, Implemented and Maintained production ETL Mappings/Jobs.

·         Experience with data warehouse methodologies, both Kimball and Inmon

·         Experience in UNIX shell scripting and scheduling tools such as cron, BMC Control-M and Autosys

·         Wrote the Windows Batch Scripts, SQL and PLSQL Scripts to process the data to improve the automation process.
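The Type 1/2/3 dimension load strategies mentioned above can be illustrated with a minimal Type 2 sketch in Python. This is illustrative only; the field names (`cust_id`, `is_current`, `valid_from`, `valid_to`) and sample data are invented, and a production load would run inside the ETL tool or PL/SQL:

```python
from datetime import date

def apply_scd2(dimension, incoming, key, tracked, today):
    """Type 2 SCD load: expire the current row when a tracked
    attribute changes, then insert a new current row."""
    current = {row[key]: row for row in dimension if row["is_current"]}
    for rec in incoming:
        existing = current.get(rec[key])
        new_row = {**rec, "valid_from": today, "valid_to": None,
                   "is_current": True}
        if existing is None:
            dimension.append(new_row)          # brand-new member
        elif any(existing[c] != rec[c] for c in tracked):
            existing["valid_to"] = today       # expire old version
            existing["is_current"] = False
            dimension.append(new_row)          # add new version
    return dimension

dim = [{"cust_id": 1, "city": "Columbus", "valid_from": date(2014, 1, 1),
        "valid_to": None, "is_current": True}]
apply_scd2(dim, [{"cust_id": 1, "city": "Cleveland"}], "cust_id",
           ["city"], today=date(2015, 9, 20))
# dim now holds the expired Columbus row plus a current Cleveland row
```

A Type 1 load would simply overwrite the tracked columns in place instead of versioning the row.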

Technical Skills

·         Informatica PowerCenter 7.1.1, 8.1.1, 8.6.1 and 9.1 (Repository Manager, Designer, Workflow Manager and Workflow Monitor)

·         Oracle Technologies: 9i, 10.0.2, 11.0.3 databases; SQL, PL/SQL (procedures, functions, triggers, packages, collections (nested tables, arrays, associative arrays)), aggregate and analytical functions; Utilities: Import/Export, Data Pump, SQL*Loader, external tables; Packages: redefinition, scheduling, mail

·         OBIEE: Administration, Presentation (Answers, Dashboards), BI Publisher and Analytical Workspace Manager (AWM)

·         Data Modeling/Other Tools: Erwin, Toad Modeler, PLSQL Developer, Toad for Oracle, Rapid SQL 7.2

·         Scripting: Windows batch and UNIX shell scripts

·         Data Quality Tools: Informatica Data Quality

·         Microsoft Suite: Access Database; MS Excel; Excel Functions: VLOOKUP, Financial, Pivot, Dashboards and charts.

·         Other Databases: SQL Server 2005/2008, SSAS and SSRS; DB2 UDB, Teradata

 

Professional Experience

 

 

Confidential 

 

ETL Developer /Data Conversion Analyst

Confidential 

 

This is a confidential project in which all Customer, Policy and Claims information was converted and loaded into Power Suite.

The Bureau of Workers' Compensation exists to protect injured workers and employers from loss as a result of workplace accidents, and to enhance the general health and well-being of Ohioans and the Ohio economy.

Since 1912, Ohio's workers' compensation system has helped employers and employees cope with workplace injuries by providing medical and compensation benefits for work-related injuries, diseases and deaths. BWC has a central office in Columbus and 14 customer service offices located across the state.

 

Responsibilities

·         Design and develop Extract, Transform and Load (ETL) programs to move data from the legacy to the target system using PL/SQL

·         Work with Subject Matter Experts to develop data validation/verification approaches to confirm that all necessary data was converted, and converted correctly

·         Design data validation programs to support validation approach

·         Execute mock conversions, review results, and document, investigate and resolve conversion defects in a timely manner

·         Work with the conversion development team to create conversion routines

·         Support data cleansing efforts

·         Developed SQL Loader Script to load the Data from files provided by Source to Staging and created temporary tables as needed to improve the performance of the data loads.

·         Created PL/SQL Procedures to load data from source tables to staging tables.

·         Created Oracle PL/SQL cursors, triggers, functions and packages.

·         Worked extensively on Oracle SQL tuning using Explain Plan, Oracle hints and indexes

·         Efficiently created migration packages for migrating the code from DEV to TST, IQA, UAT and PROD environments

·         Worked as a temporary technical lead for managing several processes.

Environment: Oracle 10g, DB2, Oracle PL/SQL, FTP, SQL*Loader, UNIX (HP-UX), SQL Developer, Oracle Financials
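The mock-conversion validation approach above (confirm all necessary data converted, and converted correctly) amounts to a reconciliation pass between legacy and target. A minimal sketch; `claim_id` and the sample records are invented for illustration:

```python
def reconcile(source_rows, target_rows, key):
    """Compare legacy and converted data sets: report rows missing
    from the target and rows whose converted values disagree."""
    src = {r[key]: r for r in source_rows}
    tgt = {r[key]: r for r in target_rows}
    missing = sorted(set(src) - set(tgt))
    mismatched = sorted(k for k in src.keys() & tgt.keys()
                        if src[k] != tgt[k])
    return {"source_count": len(src), "target_count": len(tgt),
            "missing": missing, "mismatched": mismatched}

legacy = [{"claim_id": 101, "status": "OPEN"},
          {"claim_id": 102, "status": "CLOSED"}]
converted = [{"claim_id": 101, "status": "OPEN"}]
report = reconcile(legacy, converted, "claim_id")
# report["missing"] == [102]: claim 102 was not converted
```

In practice the same check would be run as row-count and checksum queries against both databases rather than in memory.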

 

 

Confidential 

 

 

Sr ETL Informatica/PL/SQL Developer

Confidential 

 

 

Confidential designs and manufactures industrial gas turbines for onshore and offshore electrical power generation. Solar Turbines is one of the world's leading producers of industrial gas turbines up to 30,000 horsepower. Solar Turbines Inc. has an existing data warehouse and multiple data marts for various business domains. The availability and latency of the data warehouse and data marts needed to be improved to meet business expectations: the daily load took around 10 hours to complete. This is mainly a conversion project; the processes, already developed in Oracle Warehouse Builder (OWB), were converted to Informatica to reduce the load time.

           

Responsibilities

·         Worked with various internal teams to understand the existing systems and environments

·         Understand the Mapping Specification Documents and Business Rules

·         Extracted the data from source (Oracle tables) to staging, developed the business logic and loaded into the Inventory Data Mart.

·         Developed Informatica Mappings, Sessions and Workflows using transformations (reusable & standalone)

·         Designed the Lookup and Sequence Generator transformations.

·         Created ETL mappings including various transformations such as Source Qualifier, Lookup, Update Strategy, Expression, Stored Procedure, Filter, Router and Sequence Generator, among others

·         Also increased the performance of the data loads by implementing session partitioning and database tuning.

·         Performed Unit testing and Performance Tuning testing

·         Developed PL/SQL procedures, functions, triggers and views

·         Involved in performance tuning of SQLs using Oracle optimization,  indexes, partitions and tuning join paths

·         Efficiently created migration packages for migrating the code from DEV to TST, IQA and PROD environments

Environment: Informatica 9.1, Oracle 11g, Oracle PL/SQL, Control-M, Flat files, XML files, Informatica Data Quality (IDQ), UNIX, Toad, Erwin, SQL, Tidal
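One of the caching techniques used in these loads, a static lookup cache, amounts to reading the lookup table into memory once instead of querying it per source row. An illustrative sketch with SQLite standing in for Oracle; the `item_dim` table and its contents are invented:

```python
import sqlite3

def build_lookup_cache(conn, table, key_col, value_col):
    """Load a lookup table into memory once (the idea behind a
    static lookup cache) instead of issuing one query per row."""
    cur = conn.execute(f"SELECT {key_col}, {value_col} FROM {table}")
    return dict(cur.fetchall())

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE item_dim (item_code TEXT, item_key INTEGER)")
conn.executemany("INSERT INTO item_dim VALUES (?, ?)",
                 [("GT-30", 1), ("GT-45", 2)])
cache = build_lookup_cache(conn, "item_dim", "item_code", "item_key")

# Resolve surrogate keys for incoming fact rows from the cache.
source_rows = [{"item_code": "GT-30", "qty": 5},
               {"item_code": "GT-45", "qty": 3}]
fact_rows = [{"item_key": cache[r["item_code"]], "qty": r["qty"]}
             for r in source_rows]
```

The trade-off is memory for speed: a dynamic cache would additionally insert rows into the cache as new keys arrive during the load.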

 

 

Confidential 

 

ETL Informatica/PL/SQL Developer

Confidential 

 

The purpose of this project is to provide data that will enable xx Cellular to deliver an analytical information solution for customer billing and payments. This Enterprise Data Warehouse solution provides an integrated data store to allow for analysis across subject areas such as customer, product, service and revenue.

 

Responsibilities

·         Worked with various internal teams to understand the existing systems and environments

·         Understand the Mapping Specification Documents and Business Rules

·         Analyzed, documented and reviewed the Database and ETL dependencies on the core Ratings as well as on the ratings provided by external vendors.

·         Worked with the Kimball data warehouse methodology

·         Extracted the data from source (Operational Data Store) to staging, developed the business logic and loaded into the star schema.

·         Developed Informatica Mappings, Sessions and Workflows using transformations (reusable & standalone)

·         Modified the existing ETL mappings to load the data from Operational Data Store  to Staging to Enterprise Data Warehouse

·         Created ETL mappings including various transformations such as Source Qualifier, Lookup, Update Strategy, Expression and Stored Procedure, Filter, Router, Sequence Generator

·         Created test cases/scenarios for QA

·         Involved in performance tuning of SQLs using Oracle optimization,  indexes, partitions and tuning join paths

·         Performed Unit testing and Performance Tuning testing

·         Developed PL/SQL procedures, functions and views

·         Efficiently created migration packages for migrating the code from DEV to TST, IQA and PROD environments

Environment: Informatica 8.6.1, Oracle 11g, Oracle PL/SQL, Control-M, XML files, Operational Data Store (ODS), Autosys, SQL Server 2008, Cognos, Informatica Data Quality (IDQ), AIX, Erwin, Flat files, Toad.

 

Confidential 

 

 

ETL Developer/PL/SQL Developer

Confidential 

xxxx is one of the top telecommunications companies, with a presence all over the United States of America. The main objective of the project was to build an enterprise data warehouse for the E911 Information department. Informatica PowerCenter was used as the ETL tool for implementing the extraction, transformation and loading of data into the target warehouse databases.

Responsibilities

 

·         Worked with various internal teams to understand the existing systems and environments

·         Analyzed, documented and reviewed the Database and ETL dependencies on the core Ratings as well as on the ratings provided by external vendors.

·         Analyzed the business logic with DSO(Data Strategy & Operations) team and identified bugs in the existing code and efficiently fixed them

·         Developed Informatica Mappings, Sessions and Workflows using transformations (reusable & standalone)

·         Modified the existing ETL mappings to load the data from the core to target database  and data warehouse and from Staging to Reporting databases

·         Worked with the Kimball data warehouse methodology

·         Worked through the complete Software Development Life Cycle (SDLC)

·         Increased the performance of the data loads by incorporating temporary tables and decreased the lookup cache sizes

·         Created ETL mappings including various transformations such as Source Qualifier, Lookup, Update Strategy, Expression and Stored Procedure

·         Used Informatica Data Quality (IDQ) to cleanse data and to validate customer addresses.

·         Increased the reusability of code by creating reusable components including mapplets, worklets and reusable sessions, and created mappings which could be used in multiple sessions by configuring session parameters.

·         Also increased the performance of the data loads by implementing session partitioning and database tuning.

·         Efficiently created migration packages for migrating the code from DEV to TST, IQA, UAT and PROD environments

Environment: Informatica 8.6.1, Oracle 11g, Oracle PL/SQL, DataStage, Operational Data Store (ODS), Informatica Data Quality (IDQ), DAC, UNIX, Autosys, SQL*Loader, SQL Server 2008, Business Objects BOXI R2, Connect, CDC, SQL, Erwin, XML files, Flat files, Toad, Teradata (FastLoad, FastExport, MultiLoad, TPump)
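Address validation of the kind done with IDQ comes down to standardizing values against reference rules and checking formats. The sketch below is a toy version: real IDQ plans use vendor reference data, and the abbreviation table and sample inputs here are invented:

```python
import re

# Invented, tiny stand-in for a real street-suffix reference table.
STREET_ABBREV = {"STREET": "ST", "AVENUE": "AVE", "ROAD": "RD"}

def standardize_address(addr):
    """Toy address cleansing: uppercase, collapse whitespace and
    normalize street suffixes against the reference table."""
    parts = re.sub(r"\s+", " ", addr.strip().upper()).split(" ")
    return " ".join(STREET_ABBREV.get(p, p) for p in parts)

def is_valid_us_zip(zip_code):
    """Check the ZIP or ZIP+4 format (format only, not existence)."""
    return re.fullmatch(r"\d{5}(-\d{4})?", zip_code) is not None

standardize_address("123  main   Street")  # "123 MAIN ST"
is_valid_us_zip("43215")                   # True
```

A real validation step would also match the cleansed address against postal reference data, which is exactly what the IDQ address components provide.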

 

Confidential 

 

 

ETL /Oracle Developer

Confidential 

 

Confidential hospitals across the United States are required to periodically send details pertaining to the patients and the treatment they receive to the state, so that they can be reimbursed for the services provided. This is a large data warehousing implementation involving Erwin, Informatica ETL, Oracle, flat files and the OBIEE OLAP tool.

 

Responsibilities

·         Worked with Business Analysts and Data Modeling Team in confirming the requirements and suggested changes in the design and implementation process accordingly

·         Wrote the ETL technical design documents (both high level and low level) and reviewed the implementation process with the data modeling team.

·         Created External Tables to read the data from files and to support the data analysis by DSO (Data Strategy & Operations) team.

·         Developed Informatica mappings to load the Data from files provided by Source to Staging and created temporary tables as needed to improve the performance of the data loads.

·         Developed loan performance exception process to load the data from Staging to Loan Performance Master

·         Created Informatica mappings with several transformations including Source Qualifier, Expression, Update Strategy, Joiner, Lookup, Filter, Stored Procedure and more

·         Increased the reusability of code by creating reusable components including mapplets, worklets and reusable sessions, and created mappings which could be used in multiple sessions by configuring session parameters.

·         Increased the performance by using session partitioning, best lookup caching techniques (Static and Dynamic) and extensive SQL tuning.  

·         Worked extensively on Oracle SQL tuning using Explain-plan, Oracle hints, indexes, table partitioning etc.

·         Modified and supported Oracle views for the Data feed team from which files were generated to external clients

·         Automated the Staging to Distribution process and created Unix scripts to support the same

·         Used Informatica Data Quality (IDQ) to cleanse data and to validate customer addresses.

·         Efficiently created migration packages for migrating the code from DEV to TST, IQA, UAT and PROD environments

·         Created UNIX scripts to automate the data loads from Files to tables and vice versa by FTP the files from/onto the EDX server.

·         Worked efficiently with large volumes of data, as large as 125M


·         Worked as a temporary technical lead for managing several processes.

·         Developed Reports using OBIEE

Environment: Informatica 8.1.1, Oracle 10g, ODI, Data Explorer, Netezza, UDB DB2, Data Profiling, T-SQL, Informatica Data Quality (IDQ), Oracle PL/SQL, FTP, SQL*Loader, Siebel, MDM, CDC, Erwin, XML files, Business Objects BOXI R2, Power Exchange, ODS, SQL Server 2005, Autosys, UNIX (HP-UX), Toad, Oracle EBS, Agile methodology, OBIEE, DAC
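Loading delimited feeds into staging (the SQL*Loader / external-table pattern described above) can be sketched with SQLite standing in for Oracle; the `stg_patient` table, the delimiter and the feed contents are invented for illustration:

```python
import csv
import io
import sqlite3

def load_delimited_file(conn, table, cols, fileobj, delimiter="|"):
    """Bulk-load a delimited feed into a staging table, the way
    SQL*Loader or an external table would expose a flat file."""
    rows = list(csv.reader(fileobj, delimiter=delimiter))
    placeholders = ",".join("?" for _ in cols)
    conn.executemany(
        f"INSERT INTO {table} ({','.join(cols)}) VALUES ({placeholders})",
        rows)
    return len(rows)  # rows loaded, for load-audit logging

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stg_patient (patient_id TEXT, admit_date TEXT)")
feed = io.StringIO("P001|2009-01-15\nP002|2009-01-16\n")
n = load_delimited_file(conn, "stg_patient",
                        ["patient_id", "admit_date"], feed)
# n == 2 rows loaded into staging
```

Returning the row count mirrors the log that SQL*Loader produces, which the downstream validation steps reconcile against the source file.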

 

Confidential 

  

 

Programmer Analyst/ETL Developer

Confidential 

                                      

xxxxxxx Bank handles commercial and consumer banking operations and offers financial products, services, ideas and solutions to customers and clients in 50 states. The Project objective is to design and develop a Single Integrated Data Warehouse for reporting the loan data for the organization, which enables the Loan servicing Department to track the loans and improve the overall performance. This Data Warehouse that replaced the legacy mainframe reporting tools is an Enterprise level source for ad-hoc and summarized loan information capable of supporting the growing loan portfolio and increased sales information.

 

Responsibilities

·         Analyzed the source systems including – Entities and Relationships; Indexes and Partitions, Cardinalities and reviewed Transformation/Business requirements

·         Worked extensively with GDA (Global Data Architecture) and Data Modeling teams on the Architecture, Design & Implementation processes and reviewed with the Operations and Product Management Team to meet the desired expectations.

·         Created Informatica mappings for loading the data from flat files and 3rd party Data providers into Staging

·         Worked on Slowly Changing Dimensions (Type 2) and handled huge volumes of transactional data ranging from 500M to 1TB.

·         Designed and developed around 30 ETL processes for loading the data from the source systems (Payroll and General Ledger) to Corporate Finance Data Repository

·         Defined ETL Data load strategies for loading various dimensions and Fact tables, developed ETL Restart recovery strategies (by capturing business keys) and ETL Exception Handling methods

·         Created Oracle stored procedures and functions to accommodate complex ETL logics

·         Migrated Informatica objects from DEV to QA, Pre-production environments and created Production Release packages.

·         Involved in performance tuning of SQLs using Oracle hints, indexes, partitions and tuning join paths

·         Worked extensively on tuning of mappings/sessions using session partitioning and lookup strategies to address bottlenecks.

·         Created Unit & QA test case templates and was responsible to validate the data as per the technical specifications

·         Built Data mart Cubes and OLAP cube views by identifying dimensions, measures, hierarchies and level dependencies

·         Created materialized views and logs using analytical and aggregate functions to provide the snapshots of the data for rollup at various aggregation levels

·         Provided Metadata and built Excel Reports for Executive and Financial Analysts for slice and dice analysis

·         Created UNIX scripts for generating parameter files, scheduling and executing Informatica workflows and created delivery notifications about the outcomes of the executions

Environment: Oracle 10g, Oracle 9i, Informatica 8.1.1, Oracle EBS, PL/SQL, T-SQL, SQL Server, Erwin, Toad, Oracle Apps, Salesforce, PowerConnect, B2B, SQL*Loader, BOXI R2, Agile, Oracle E-Business, MS Office (Excel, Access), Perl scripting, Oracle optimization, Sybase, DB2, Autosys, Mainframes, FTP, COBOL, Power Exchange
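The UNIX scripts that generate parameter files for Informatica workflows typically emit a section header plus name=value pairs before `pmcmd` starts the workflow. A minimal sketch; the folder, workflow and parameter names below are invented:

```python
import os
import tempfile

def write_param_file(path, folder, workflow, params):
    """Emit an Informatica-style workflow parameter file: a
    [folder.WF:workflow] section header followed by name=value lines."""
    lines = [f"[{folder}.WF:{workflow}]"]
    lines += [f"{name}={value}" for name, value in params.items()]
    with open(path, "w") as f:
        f.write("\n".join(lines) + "\n")
    return path

par = write_param_file(
    os.path.join(tempfile.gettempdir(), "wf_loan_load.par"),
    "FINANCE", "wf_loan_load",
    {"$$LOAD_DATE": "2015-09-20",          # mapping parameter
     "$DBConnection_SRC": "LOAN_ODS"})     # session connection override
```

Generating the file at schedule time is what lets the same workflow run against different load dates and connections without editing the mappings.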

Confidential 

PLSQL Developer/ETL Developer

Confidential 

 

The main objective of the project is the Maintenance Integration project, consisting of New Depositor, Interest Warrant, Maturity, Premature & Renewal and Enquiry modules. Annual percentage rates, interest periods and amounts due on maturity date are calculated, and Fixed Deposit, Interest Warrants, premature withdrawals, maturity receipts and monthly statements are delivered.

 

Responsibilities:

·         Captured the business requirements and built process flow diagrams and charts

·         Worked with the DBA team to understand the data model and published source-to-target and application-DB table mappings

·         Reverse Engineered existing database structures to translate into Entity-Relationship model and enhanced schema structures by incorporating indexing and partitioning strategies and created staging tables

·         Involved in integrating Enquiry module to the existing Data Model and published attribute definitions

·         Wrote PL/SQL procedures, functions, packages, triggers and views for data validation, extraction, transformation and loading processes

·         Wrote ad-hoc SQL, created system test case documents and was responsible for the efficient running of applications

·         Responsible for unit and module testing and also participated in system and application integration testing

·         Enhanced query performance by analyzing data retrieval paths, tuning join paths and creating indexes, partitions, DB hints

·         Used Oracle utilities (Export/Import, SQL Loader, External Tables) for data loading and migration processes

·         Created materialized view logs and materialized views to provide current snapshots of the data

·         Scheduled the batch programs using the DBMS_SCHEDULER and DBMS_JOB packages

·         Developed Informatica Mappings, Sessions and Workflows using transformations (reusable & standalone)

·         Modified the existing ETL mappings to load the data from Operational Data Store  to Staging to Enterprise Data Warehouse

·         Increased the performance of the data loads by incorporating temporary tables and decreased the lookup cache sizes

·         Created ETL mappings including various transformations such as Source Qualifier, Lookup, Update Strategy, Expression and Stored Procedure, Filter, Router, Sequence Generator

 

Environment: Oracle 9i, SQL Developer, Oracle utilities, SQL*Loader, PL/SQL, SQL, SQL optimization, Windows NT
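The maturity calculations described for this project reduce to periodic compounding. A minimal sketch; the rate, term, compounding frequency and rounding below are illustrative, not the bank's actual rules:

```python
def maturity_amount(principal, annual_rate, years,
                    compounding_per_year=4):
    """Amount due on maturity of a fixed deposit with periodic
    (here quarterly) compounding: P * (1 + r/n)^(n*t)."""
    periods = compounding_per_year * years
    rate_per_period = annual_rate / compounding_per_year
    return round(principal * (1 + rate_per_period) ** periods, 2)

# 10,000 at 8% APR for 2 years, compounded quarterly.
due = maturity_amount(10000, 0.08, 2)
```

Premature-withdrawal and renewal modules would apply the same formula over the elapsed periods, usually at a penalized rate.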

 
