Legal and General, UK
• Gathering and analyzing business and data requirements from clients and providing end-to-end solutions
• Designing and reviewing functional requirements and ensuring coverage
• Coordinating with technical architects, business analysts, and the functional team to ensure prerequisites and acceptance criteria are in place in the captured requirements
• Leading a globally distributed team and ensuring delivery per the agreed SLAs
• Reviewing progress, identifying risks, dependencies, and challenges, and reporting and escalating when needed
• Performing data analysis for migrating data from different sources to the Hadoop ecosystem
• Creating and building big data clusters on EMR, launching EC2 instances with S3 storage and Redshift database services, and configuring Route 53 in the AWS cloud infrastructure
• Querying data from S3, processing log data for analytics by cleaning and enriching data sets, and scheduling ETL jobs to transform and load the data using AWS Glue
• Using Athena to analyze the different source data present in S3
• Creating data ingestion flows/data pipelines for processing in-house and traditional data
• Loading streaming data into HDFS using Kafka Connect and Kinesis in the cloud environment
• Familiar with Python subroutines and libraries, using IDEs such as Jupyter Notebook and PyCharm
• Designing and engineering the data ingestion process to support multiple input types and image mapping, and migrating legacy application data to the data lake by ingesting it with Sqoop
• Creating Hive external tables on top of the data loaded into HDFS
• Creating Hive schemas using performance techniques such as partitioning and bucketing
• Building Spark programs in Scala to apply business logic to the data
• Storing the processed data in Hive tables
• Exporting data from Hive to Oracle using Sqoop jobs
• Managing all project applications set up in the Cloudera environment
• Involved in creating the data lake to bring together all source data for processing
• Performing data migration, extraction, and loading, and designing data conversions from a wide variety of source systems
• Applying Hive structures to the data in HDFS and creating external tables on top of it
• Familiar with Python for data analysis and data validation, using PyCharm
• Building Spark programs in PySpark to process the data and apply business logic
• Storing the processed data in the target HDFS directory
• Writing scripts to dump data from HDFS into Hive partitioned tables for data analysis and testing
• Creating Hive aggregated views to feed into Tableau for reporting purposes
• Automating the sequence of steps by placing the code in a scheduler
• Using Git for version control, with GitHub/Bitbucket as the repository
• Coordinating with cross-functional teams, technical and support teams, third-party vendors, and the client
• Working in Scrum to develop and deliver the requested and committed product increments as business requirements change
• Working closely with project and test programme management to ensure quality and testing requirements and prerequisites are in place, and supporting the delivery of artifacts
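The Hive partitioning and bucketing techniques mentioned above can be sketched in stdlib-only Python. This is an illustrative model, not production code: the column names (`customer_id`, `dt`, `region`) are hypothetical, and `crc32` stands in for Hive's own hash function.

```python
import zlib
from collections import defaultdict

def partition_path(record, partition_cols):
    # Hive lays out partitions as key=value directories, e.g. dt=2021-06-01/region=UK.
    return "/".join(f"{c}={record[c]}" for c in partition_cols)

def bucket_id(value, num_buckets):
    # Hive assigns a row to a bucket by hashing the bucketing column
    # modulo the bucket count; crc32 is a stand-in for Hive's hash here.
    return zlib.crc32(str(value).encode()) % num_buckets

# Hypothetical records for illustration.
records = [
    {"customer_id": 101, "dt": "2021-06-01", "region": "UK"},
    {"customer_id": 102, "dt": "2021-06-01", "region": "UK"},
    {"customer_id": 103, "dt": "2021-06-02", "region": "US"},
]

layout = defaultdict(list)
for r in records:
    key = (partition_path(r, ["dt", "region"]), bucket_id(r["customer_id"], 4))
    layout[key].append(r["customer_id"])

for (path, bucket), ids in sorted(layout.items()):
    print(f"{path}/bucket_{bucket}: {ids}")
```

In a real Hive table the same layout would come from `PARTITIONED BY` and `CLUSTERED BY ... INTO 4 BUCKETS` clauses; partition pruning and bucketed joins are why these techniques improve query performance.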
Legal and General, UK
As a Big Data Lead:
• Building and leading the team to deliver the project successfully
• Defining the scope within the context of each release
• Deploying and managing resources for both onshore and offshore teams
• Designing and overseeing migration activities to the cloud environment
• Applying appropriate test measurements and metrics within the testing team
• Creating ETL mappings in Informatica using transformations such as Source Qualifier, Expression, and Lookup
• Creating and executing workflows and troubleshooting the mappings/workflows
• Practicing a CI/CD (continuous integration and delivery) development model
• Creating and reviewing the Master Test Plan, Test Approach document, and QA sign-offs
• Providing test scoping, test estimates, and test strategy
• Reviewing metrics, test artifacts, the migration plan, and the data validation checklist
• Reviewing the environment, execution, and operational and business acceptance criteria
• Planning, deploying, and managing the testing effort for each release/engagement
• Responsible for the quality delivery of artifacts per the defined SLAs
• Expertise in Test Management and Defect Management
• Coordination of DB Team, Infra Team, Access Management Team, Vendors and Client
• Securing and engaging the necessary skilled resources to perform tasks as per plan
• Negotiating with Dev, Operations, Performance, and third-party or client teams for support provisions when needed, and challenging unrealistic deadlines
• Hosting meetings with all stakeholders and the client to track task status and defects
• Primarily responsible for stakeholder management, risk management, and defect management
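The Informatica mapping pattern above (source qualifier, expression transformation, lookup) can be sketched in plain Python. All names here are hypothetical (`policy_id`, `premium`, the branch lookup); a real mapping would read from a database source and a lookup table rather than inline strings.

```python
import csv
import io

# Hypothetical source extract (the "source qualifier" would read this from a DB).
SOURCE_CSV = "policy_id,premium,branch_code\nP001,120.50,LDN\nP002,89.99,MCR\n"

# Hypothetical reference data standing in for a Lookup transformation.
BRANCH_LOOKUP = {"LDN": "London", "MCR": "Manchester"}

def run_mapping(source_text):
    rows = csv.DictReader(io.StringIO(source_text))  # source-qualifier style read
    out = []
    for row in rows:
        # Expression transformation: derive an annual premium from the monthly one.
        row["annual_premium"] = round(float(row["premium"]) * 12, 2)
        # Lookup transformation: resolve branch_code to a branch name.
        row["branch_name"] = BRANCH_LOOKUP.get(row["branch_code"], "UNKNOWN")
        out.append(row)
    return out

for r in run_mapping(SOURCE_CSV):
    print(r["policy_id"], r["annual_premium"], r["branch_name"])
```

The default of `"UNKNOWN"` mirrors the common Informatica practice of returning a default value on a failed lookup rather than dropping the row.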
As a Database/Big Data Tester:
• Understanding the Business requirements, Technical Architecture and Design documents
• Responsible for testing database programs and developing data models
• Responsible for validating data staging, MapReduce, and output in the big data system
• Validating data from different sources and ensuring the right data is pulled into HDFS
• Comparing source data with the data in HDFS to confirm both match
• Verifying that data aggregations are implemented and key-value pairs are generated
• Verifying that transformation rules are applied to the data and that there is no corruption in the target data
• Evaluating performance by checking data ingestion speed and throughput, then data processing speed by measuring how quickly queries or MapReduce jobs execute
• Writing database code and monitoring database performance
• Entering data and processing information in computer systems
• Developing standards and guidelines for protecting database systems
• Evaluating workflow charts, correcting database errors, and modifying database systems when required
• Developing security measures to protect data systems
• Assisting development team or testers in executing test cases and troubleshooting defects
• Documenting test results by following standard procedures and guidelines
• Supporting the QA analyst in fixing database issues to meet client requirements
• Reporting test findings to senior staff for recommendations
• Developing test plan procedures in coordination with the QA and Dev teams
• Interpreting business requirements and coordinating with end users to understand them
• Creating test estimates and defining test scope and test strategy
• Creating test plans and test approach documents, including User Acceptance Test (UAT), Business Acceptance Test (BAT), and Production Acceptance Test (PAT) plans
• Identifying the hardware, software, and tools required for test execution, and installing/integrating them into the lab environment to carry out end-to-end solution validation
• Managing the environment by ensuring test data and setup are in place
• Managing risk by identifying business and technical risks, documenting mitigations, and tracking them to closure
• Reviewing the test coverage by using Traceability Matrix
• Uploading approved test cases to QC, assigning them, and executing them under releases
• End-to-end functional and application knowledge of the project at hand
• Scheduling tests for execution, then monitoring, measuring, controlling, and reporting on test progress, product quality status, and test results, adapting the test plan as needed to adjust to evolving conditions
• Proven ability in assigning defect priority and severity using fundamental defect management experience and best practice
• Efficient Bug Triage management and delivery
• Understanding and identifying Configuration Items (CIs), and handling change and configuration management of all test artifacts
• Coordinating with third-party suppliers and the customer on test planning, reviews, test setup, execution, penetration testing, OAT, defect review, and test reporting
• Responsible for managing EUC for desktops and VMs at the client site
• Defect management for multiple projects using HP ALM, Rational Quality Manager (RQM), and JIRA
• Producing daily status reports, metrics, trending information, and forecasting data
• Maintaining good rapport with third parties, the client, and end users
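The source-to-HDFS comparison checks described above reduce to reconciling row counts and content between two extracts. A stdlib-only sketch, with hypothetical in-memory records standing in for reads from the source database and HDFS:

```python
import hashlib

def dataset_fingerprint(rows):
    # Order-independent checksum: hash each row, XOR the digests together,
    # so the comparison tolerates rows arriving in a different order.
    acc = 0
    for row in rows:
        digest = hashlib.sha256("|".join(map(str, row)).encode()).hexdigest()
        acc ^= int(digest, 16)
    return len(rows), acc

def reconcile(source_rows, target_rows):
    src_count, src_sum = dataset_fingerprint(source_rows)
    tgt_count, tgt_sum = dataset_fingerprint(target_rows)
    return {"count_match": src_count == tgt_count,
            "content_match": src_sum == tgt_sum}

# Hypothetical extracts: same data, different row order.
source = [(1, "alice", 100), (2, "bob", 200)]
target = [(2, "bob", 200), (1, "alice", 100)]
print(reconcile(source, target))
```

At Hadoop scale the same idea would be run as a distributed job (e.g., checksums per partition) rather than in a single process, but the pass/fail logic is identical.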
Cisco Systems Inc., US
• Understanding requirements from the Business Requirement Document (BRD), Functional Requirement Document (FRD), and interface requirement documents
• Test scoping and providing test effort estimates
• Creation of Test Plans and designing Test Cases and Test Scripts.
• Debugging SIP/SCCP signaling call flows using the Ethereal tool
• Analysis and review of test plans for all the new features
• Maintaining the internally built JTAPI test tool by adding new functionality and enhancing existing functionality in Java
• Writing new functionality and automating new features in Java using APIs
• Installing and upgrading Cisco Unified Communications Manager loads and setting up test beds
• Environment and lab maintenance: server upgrades and migration, and building failover/fallback setups
• Automation of Test Cases and execution using JTAPI Test Tool
• Performing functional and regression tests across all Windows/Linux releases of Unified Communications Manager, CTI, and JTAPI
• Executing both manual and automated test cases
• Defect Verification and Test Reporting for different releases in Test Information Management System (TIMS), Jira and HP QC tools.
• Test Status reporting and Test closure activities.
• Documenting lessons learned and suggesting process improvement plans
• Knowledge transfer and leading training activities for new team members
• Primarily responsible for Defect Prevention (DP) and Configuration Management (CM) activities
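Debugging SIP call flows, as described above, amounts to grouping captured signaling messages by Call-ID and reading off the request/response sequence. A stdlib-only sketch with hypothetical messages (a real trace would come from an Ethereal/Wireshark capture):

```python
from collections import defaultdict

# Hypothetical captured SIP messages for one call (abbreviated headers).
RAW_MESSAGES = [
    "INVITE sip:bob@example.com SIP/2.0\r\nCall-ID: abc123\r\n",
    "SIP/2.0 180 Ringing\r\nCall-ID: abc123\r\n",
    "SIP/2.0 200 OK\r\nCall-ID: abc123\r\n",
    "ACK sip:bob@example.com SIP/2.0\r\nCall-ID: abc123\r\n",
]

def first_line(msg):
    tokens = msg.split("\r\n", 1)[0].split()
    # Responses start with the SIP version; requests start with the method.
    return " ".join(tokens[1:3]) if tokens[0] == "SIP/2.0" else tokens[0]

def call_id(msg):
    for line in msg.split("\r\n"):
        if line.lower().startswith("call-id:"):
            return line.split(":", 1)[1].strip()
    return None

# Group the signaling into per-call flows keyed by Call-ID.
flows = defaultdict(list)
for m in RAW_MESSAGES:
    flows[call_id(m)].append(first_line(m))

print(flows["abc123"])
```

The expected sequence for a successful basic call setup (INVITE, 180 Ringing, 200 OK, ACK) follows RFC 3261; a deviation in the sequence is what the debugging step would flag.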
Education and Training
Bachelor's Degree in Computer Science Engineering
JNTU, Hyderabad, India