Naveen Undrathi

Sr Full Stack Java Engineer


Email: naveen.undrathi@gmail.com | Phone No: (469) 826-2992
LinkedIn: https://www.linkedin.com/in/naveen-undrathi-594b37ba/

Professional Summary

 Innovative Java full stack software engineer with 10+ years of strong experience in software analysis,
design, development, implementation, and testing of web-based enterprise applications
developed using Java, Spring MVC, Spring Boot, and Python.
 Expert in Java 8, with extensive experience in writing RESTful web services and microservices using Spring
Boot and documenting services using Swagger API.
 Experience in developing and deploying applications with a microservices architecture, and good experience with
the Spring framework, Spring Boot, Spring Data JPA, and Hibernate.
 Experience in using development IDEs like Eclipse, IntelliJ and version control systems like GitHub and
Subversion and using logging libraries like Log4j.
 Good working experience with databases such as Oracle, MySQL, PostgreSQL, and MongoDB
 Hands-on experience in developing applications for IBM MQ and Apache Kafka
 Proficient in unit testing frameworks like JUnit, Spring Test, Mockito, and Pytest
 Expertise in using data interchange formats like JSON and XML.
 Proficient in implementing and managing AWS messaging services like Amazon SQS for reliable, scalable, and
distributed message queuing systems.
 Experienced in decoupling microservices using Amazon SQS to ensure seamless, asynchronous communication
with fault tolerance and scalability.
 Expertise in Amazon SNS for real-time message broadcasting and building pub/sub architectures.
 Hands-on experience with Python scripting, Django, and FastAPI
 Hands-on experience in migrating APIs to GraphQL
 Worked on performance- and cost-efficient solutions in the AWS cloud to suit application-specific needs.
 Experience in handling messaging services using Apache Kafka
 Proficient in designing and deploying scalable, distributed streaming platforms using Apache Kafka.
 Expertise in real-time data processing and building highly available, fault-tolerant Kafka clusters.
 Hands-on experience with Kafka brokers, Zookeeper, Kafka Connect, and Kafka Streams for data ingestion,
processing, and routing.
 Skilled in event-driven architectures, ensuring high throughput, low-latency messaging, and data stream
management.
 Optimized Kafka clusters for performance, including tuning configurations, partition management, and
consumer lag monitoring.
 Experienced in integrating Kafka with AWS services, microservices, Spark and other big data tools.
 Expertise in using Python libraries such as NumPy, pandas, Matplotlib, and scikit-learn
 Strong domain knowledge in the banking, finance, and aeronautical domains
 Implemented log management with Splunk, Kibana, and the ELK stack.
 Implemented security for RESTful web services using JWT and OAuth 2.0.
 Hands-on experience with the Azure cloud, using services such as Blob Storage, Virtual Machines, Azure SQL, and Azure Functions
 Worked on DevOps practices like CI/CD, Git, Jenkins, Artifactory and Udeploy
 Hands-on experience with Terraform and AWS CloudFormation templates
 Developed and maintained Java-based web applications using OpenShift platform.
 Proficient in containerization technologies such as Docker, Kubernetes, and OpenShift
 Experience with Test Driven Development (TDD), Behavior Driven Development (BDD), and refactoring code.
 Good experience in writing joins, views, stored procedures, functions, triggers, cursors, and collections using SQL & PL/SQL
 Strong debugging and problem-solving skills with excellent understanding of system development methodologies,
techniques and tools.

Certifications:
 Oracle Certified Java Programmer
 Six Sigma Green Belt certified
 Azure AZ-900 certified

Technical Skills:
Languages/Frameworks: Java 1.6, Java 1.7, Java 1.8, J2EE, Spring 4 & 5, Spring Boot 1.6 to 3.1, Spring Data JPA, Python, RESTful Web Services, JAXB, Spring JDBC, Apache Commons Logging, XML, JSON, JUnit 5, Design Patterns, Spring WebFlux, Scalable Applications, Microservices Architecture, REST API, Resiliency Patterns, Enterprise Apps, GraphQL, Cucumber
Front-end: React | JavaScript | Node JS | JSP | HTML | CSS | jQuery
Back-end: Java | Python | PL/SQL
Build Tools: Maven | Gradle
Databases: Oracle | MySQL | MongoDB | PostgreSQL
Queues: Kafka
CI/CD: Docker | GitHub | Kubernetes | Ansible | OpenShift
App Servers: Apache Tomcat | JBoss Application Servers
Testing Frameworks: JUnit | Mockito | Pytest
Tools/Methodologies: Agile-Scrum | Waterfall
Cloud Infrastructure: AWS | Azure
Debugging Tools: Kibana | Splunk
Cloud Technologies & DevOps Tools: EC2 | EBS | S3 | ECS | EKS | SQS | SNS | RDS | Glacier | Lambda | Kubernetes | Docker | OpenShift | Vault | Ansible | Terraform | CloudFormation templates

Professional Experience:
Client : AARP (American Association of Retired Persons), Malvern, PA
Role : Senior Software Engineer
Duration : September 2024 to present
Description:
Event and Engagement Management Clearinghouse (EEMC):
EEMC is the System of Record and central data repository to collect, store, and process event and engagement data.

EEMC's main goal is to develop a better solution for managing AARP's event data engagements. A repository has been
created that stores data coming from event platforms, cleanses it, and transmits it to AARP's backend systems.
EEMC is the system of record for all event and engagement data and processes the data in near real time. Ultimately, any
event and engagement data received from any platform will reside in EEMC.

Responsibilities:
 Responsible for requirement analysis, design and documentation.
 Used the Spring Boot framework for API development
 Used AWS Lambda functions for fetching data from upstream systems at scheduled time intervals
 Used DynamoDB as an intermediate repository for storing the data
 Developed the user interface using Angular
 Involved in creating and implementing custom fields using REST APIs and PostgreSQL
 Developed database objects using the PostgreSQL RDBMS
 Used Java features such as collections, exception handling, and Stream APIs
 Tuned and rewrote complex queries to improve performance
 Used AWS services such as EC2, Step Functions, and SQL as part of the data pipeline
 Used Kafka for event messaging and published topics for reliable message delivery
 Used CI/CD tools such as the Bitbucket repository and Jenkins for deployment
 Developed CI/CD pipelines using AWS CodePipeline, CodeBuild, and CodeDeploy, reducing deployment times and
ensuring faster release cycles.
 Involved in code reviews, test reviews, sprint planning, grooming, and retrospectives
 Involved in system testing and regression testing for newly implemented application databases.
 Designed data models, schemas, and relational table structures as per the client's requirements.
 Scheduled the AWS Lambda functions using Amazon EventBridge
 Used Python for implementing the AWS Lambda functions
 Used Amazon Web Services (Amazon EC2, Amazon S3, Amazon Simple DB, Amazon Elastic Load Balancing,
Amazon SQS, Amazon EBS).
 Used Terraform and CloudFormation templates for creating services with an infrastructure-as-code approach

Environment:
 Spring Boot, Java 8, REST API, Angular, CloudWatch, Kafka, AWS, DynamoDB, PostgreSQL, Python, Postman

Client : LTIMindtree Ltd (HSBC bank), Irving, Texas


Role : Senior Software Engineer
Duration : August 2022 to August 2024
Description:
Migration of Mule APIs to Java & Python:
The Mule-to-Kong migration is an HSBC client project in which APIs are being migrated from MuleSoft to Spring Boot and Python
FastAPI using the Kong framework.

HSBC uses the Mule APIs for performing various operations through its internal self-service portal. The current version of
Mule is 2.0, which needs to be upgraded to 3.0, and this involves a high cost in managing the license. In order to reduce the
Mule licensing cost, HSBC decided to migrate the APIs from Mule to Spring Boot and FastAPI using the Kong framework.
Responsibilities:
 Responsible for requirement gathering, analysis and design and documentation.
 Used Java frameworks such as Spring MVC and Spring Boot
 Used the Python framework FastAPI
 Used React for the front-end user interface
 Involved in migrating the APIs from MuleSoft to REST APIs and GraphQL APIs
 Involved in API migration using FastAPI, Java 8, Spring Boot, and microservices
 Involved in the development of database components such as packages, procedures, functions, and triggers
 Used Django framework for rapid development of the applications.
 Tuned the performance of the complex queries using the Materialized views.
 Used CI/CD tools such as Jenkins, UDeploy, and Artifactory.
 Involved in automating the deployment process end-to-end.
 Involved in code reviews, test reviews, and stakeholder discussions.
 Involved in system testing and regression testing for newly implemented application databases.
 Designed data models, schemas, and relational table structures as per the client's requirements.
 Designed Ansible playbooks.
 Involved in deploying the APIs to the IKP cluster using Docker and Kubernetes.
 Used Terraform and CloudFormation templates for creating services with an infrastructure-as-code approach
 Proficient in containerization technologies such as Docker, Kubernetes, and OpenShift
 Used PL/SQL concepts such as stored procedures, functions, triggers, cursors, and collections
Environment:
 Spring Boot, Java, REST API, React, GraphQL, Splunk, Kafka, AWS, Microservices, Oracle SQL, PL/SQL, Python

Client : Wells Fargo International (Wells Fargo Bank), Charlotte, NC


Role : Senior Software Engineer
Duration : June 2020 to August 2022
Description:
Wallstreet Suite and SimCorp Dimension:
The objective of the Wallstreet Suite and SimCorp applications is to provide retail customers' transaction information
to upstream and downstream teams based on criteria provided by the business team. These are long-standing
applications at Wells Fargo that support numerous transactions.

Wallstreet Suite and SimCorp applications generate various reports pertaining to regular transactions made by retail
customers. Daily, weekly, and monthly reports are generated by these applications based on activities configured in the
front-end application. The core part of these applications resides in the database. The data in the database
is loaded regularly from upstream streams, and the required data is supplied to the downstream applications.
Responsibilities:
 Responsible for requirements gathering, analysis and design and documentation.
 Involved in requirement gathering and requirement analysis by interacting with various teams
 Used React for the front-end user interface
 Implemented new features in the applications using Java 8 and the Oracle database
 Streamlined the application by identifying and fixing bugs
 Improved the performance of the application by enhancing the existing code using Java collections
 Migrated the applications from Java MVC to the Spring Boot framework
 Used SonarQube to measure both class level & method level code quality.
 Written new stored procedures and functions for improved data collection and processing
 Worked with Python's Django framework to develop a test automation tool
 Written test cases using TDD and BDD test methodologies
 Production support during application release
 Worked on building the release pipeline end to end using Jenkins, Udeploy and Artifactory
 Migrated 100+ Perl scripts to Python
 Actively participated in stakeholder meetings, scrums, and status calls
Environment:
 Spring Boot, Java, REST API, React, GraphQL, Splunk, Kafka, AWS, Microservices, Oracle SQL, PL/SQL, Python.

Client : ADP Inc, Roseland, NJ


Role : Senior Member Technical
Duration : January 2019 to May 2020
Description:
WorkForce Now (WFN)
WFN (Workforce Now) is a product that ADP has owned for 15 years; WFN provides the end-to-end payroll and HR services
required by an organization. WFN is currently used by 90% of Fortune 500 companies across the world as part of their
payroll and HR.

ADP Workforce Now (WFN) delivers the versatility of an easy-to-use solution that scales with an organization and the
convenience of an all-in-one platform that seamlessly integrates with an organization's favorite systems. WFN handles accurate pay,
deductions and tax filing, online reporting, analytics, built-in alerts, and calculations. It provides regulatory and compliance
support for peace of mind.

Responsibilities:
 Responsible for requirements gathering, analysis and design and documentation.
 Involved in writing various batch jobs for automating the payroll process
 Developed application using Java frameworks such as Spring MVC, Spring Boot
 Involved in data preparation for the data science team for their application processing
 Involved in the development of database components such as packages, procedures, functions, and triggers
 Used Java features such as collections, exception handling, and multithreading.
 Tuned the performance of the complex queries using the Materialized views.
 Supported various payroll applications, bug fixing and production support
 Involved in system testing and regression testing for newly implemented application databases.
 Involved in enhancing the application by identifying bugs and proposing changes
 Developed various payroll components using cutting-edge technologies
 Worked on data analysis by pulling the data from AWS cloud
Environment:
 Spring Boot, Java, REST services, Microservices, AWS, SQL, JavaScript, HTML, CSS, React, Log4j, IntelliJ.

Client : Honeywell International Inc, Phoenix, AZ


Role : Senior Engineer
Duration : January 2018 to December 2018

Description:
The Core Processing Tool is a product that produces various navigational databases for aircraft and helicopters, which help the
aircraft or helicopter fly from one area to another using different waypoints. The navigational databases show
different obstacles and paths to the pilot while the aircraft is moving.
Responsibilities:
 Responsible for requirements gathering, analysis and design and documentation as the application was
started from scratch.
 Onsite production support
 Worked closely with the onsite team and gathered requirements from the production team
 Tested the navigational databases on test benches located onsite
 Worked closely with the onsite and offshore teams, acting as a bridge between them
Environment: Core Java, PL/SQL, SQL, JavaScript, Unix, Azure, shell scripting, Spring MVC, Spring Boot, and RESTful services.

Client : Honeywell International, India


Role : Senior Engineer
Duration : August 2015 to January 2018
Description:
CORE PROCESSING TOOL:
The Core Processing Tool is a product that produces various navigational databases for aircraft and helicopters, which help the
aircraft or helicopter fly from one area to another using different waypoints. The navigational databases show
different obstacles and paths to the pilot while the aircraft is moving.
Responsibilities:
 Responsible for requirements gathering, analysis and design and documentation as the application was
started from scratch.
 Provided technical direction and system architecture expertise.
 Involved in various aerospace applications
 Implemented various models to support the latest flight management systems
 Involved in Six Sigma training and obtained a Green Belt certificate
 Used Azure services such as Blob Storage, Virtual Machines, and Azure SQL
 Used Azure SQL for pulling data and preparing reports
 Developed various web pages using HTML and JSP as part of the day-to-day job routine
 Improved the performance of the application using the Java collections framework
 Involved in customizing the application based on stakeholder requirements
 Used JSP, JavaScript, and jQuery for front-end application development
 Involved in the development of database components such as packages, procedures, functions, and triggers
 Used Java features such as collections, exception handling, and multithreading.
 Tuned the performance of the complex queries using the Materialized views.
 Supported various aerospace applications, including bug fixing and production support
 Involved in system testing, regression testing, and sanity testing
 Worked on various components of flight management systems
Environment: Spring MVC, Spring Boot, RESTful services, Core Java, PL/SQL, SQL, JSP, jQuery, JavaScript, Azure, Unix, shell
scripting

Client : MagnaQuest Technologies, India


Role : Software developer
Duration : December 2013 to August 2015
Description:
BILLING AND CRM (branded as SURE!)
Billing and CRM is a product that provides end-to-end services in sectors such as Pay TV, VoIP, ISP, SaaS, triple-play TV, and
subscription-based services. The product intelligently manages the subscription life cycle, including customer creation, order
placement, provisioning, invoicing, payment processing, collection follow-ups, payments, and handling of service requests.
Responsibilities:
 Responsible for requirements gathering, analysis and design and documentation as the application was
started from scratch.
 Involved in creating database tables and schemas, and in DBA activities
 Production support during application releases
 Developed various reports using tools such as Crystal Reports and Power BI
 Involved in the development of database components such as packages, procedures, functions, and triggers
 Customized the application based on client requirements
 Used Java features such as OOP concepts, collections, exception handling, and memory management
 Tuned the performance of the complex queries using the Materialized views.
 Supported various modules in the Billing and CRM domain
 Worked on Crystal Reports
 Developed shell scripts for automating jobs using crontab
 Provided on-call support whenever required
Environment:
 Core Java, Oracle SQL, PL/SQL, Crystal Reports, Unix, and shell scripting

Education:

 JNTU University, India | B.Tech in IT
