PROFESSIONAL EXPERIENCE

• Work on a Hadoop cluster with a current size of 56 nodes and 896 terabytes of capacity.
• Configured CloudWatch logs and alarms.
• Set up various components of a Kubernetes (k8s) cluster on AWS using Ubuntu 18.04 Linux images.
• Installed and set up a Kubernetes cluster on AWS manually from scratch.
• Deployed and configured the DNS (Domain Name Server) manifest using CoreDNS.
• Write MapReduce jobs, HiveQL, Pig, and Spark.
• Understand the structure of the data, build the data architecture, implement the data model in Vertica, and carry out data mapping from the legacy Oracle system to Vertica.
• Imported data from Teradata into HDFS using Sqoop.
• Worked with Teradata and Oracle databases, with Unix as the backend.

Tools and Methodologies: Teradata, Base SAS; Waterfall, Agile.
Additional Trainings: Received training in SQL-H of Big Data Hadoop and Aster.

TYPICAL SKILLS FOR A BIG DATA ENGINEER RESUME

• Hadoop: experience with storing, joining, filtering, and analyzing data using Spark, Hive, and MapReduce.
• Big data development experience on the Hadoop platform, including Hive, Impala, Sqoop, Flume, Spark, and related tools to build analytical applications.
• 3+ years of experience developing with Java and/or Hadoop technologies; experience developing with a modern JDK (v1.8+).

Headline: Junior Hadoop Developer with 4+ years of experience in project development, implementation, deployment, and maintenance using Java/J2EE and Big Data technologies; designed and implemented complete end-to-end Hadoop-based data analytics solutions using HDFS, MapReduce, Spark, YARN, …

Writing a great Hadoop Developer resume is an important step in your job search journey, and recruiters are usually the first ones to tick these boxes. When writing your resume, be sure to reference the job description and highlight any skills, awards, and certifications that match its requirements.
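The CoreDNS bullet in the experience section above corresponds to shipping a Corefile to the cluster as a ConfigMap. A minimal sketch follows; the cluster domain, namespace, and plugin set are the common kubeadm defaults and are assumptions, not details taken from this resume:

```yaml
apiVersion: v1
kind: ConfigMap
metadata:
  name: coredns
  namespace: kube-system
data:
  Corefile: |
    .:53 {
        errors
        health
        kubernetes cluster.local in-addr.arpa ip6.arpa {
            pods insecure
            fallthrough in-addr.arpa ip6.arpa
        }
        forward . /etc/resolv.conf
        cache 30
        loop
        reload
    }
```

Applied with `kubectl apply -f coredns-configmap.yaml`; the `reload` plugin lets CoreDNS pick up later edits to this ConfigMap without a pod restart.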
Project: Teradata Administration for Integrated Data Warehouse
Role: Lead Teradata DBA | Domain: Securities
Database: Teradata 13.10
Operating System: UNIX
Teradata Tools and Utilities: SQL Assistant, BTEQ, FastLoad, FastExport, MultiLoad, TPT, PMON, Teradata Manager, Teradata Administrator, Viewpoint, TSET, TASM
BAR: NetBackup
Work Profile: …

Responsibilities:
• Accountable for DBA activities.
• Migration of various application databases from Oracle to Teradata.
• Setting up the AWS cloud environment manually.
• Used various kinds of transformations to implement simple and complex business logic.

Current: Hadoop Lead / Sr. Developer
Confidential

Ahold Delhaize USA – Quincy, MA – July 2011 to Present
• Worked on the development of Big Data POC projects using Hadoop, HDFS, MapReduce, and Hive.

Charles Schwab & Co. – June 2013 to October 2014

You may also want to include a headline or summary statement that clearly communicates your goals and qualifications. Picture this for a moment: everyone out there is writing their resume around the tools and technologies they use. But the Director of Data Engineering at your dream company knows that tools and tech are beside the point.
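The MapReduce work listed above can be illustrated with a minimal word-count sketch in plain Python, mimicking the Hadoop Streaming mapper/reducer contract. The function names and sample input are illustrative, not from the original resume:

```python
from itertools import groupby
from operator import itemgetter

def mapper(lines):
    # Map phase: emit one (word, 1) pair per word, as a
    # Hadoop Streaming mapper would print tab-separated pairs.
    for line in lines:
        for word in line.split():
            yield word.lower(), 1

def reducer(pairs):
    # Reduce phase: pairs arrive grouped by key after the
    # shuffle/sort; sum the counts for each word.
    for word, group in groupby(sorted(pairs), key=itemgetter(0)):
        yield word, sum(count for _, count in group)

counts = dict(reducer(mapper(["big data", "big cluster"])))
print(counts)  # {'big': 2, 'cluster': 1, 'data': 1}
```

On a real cluster the sort between the two phases is done by the framework; the explicit `sorted()` call here stands in for that shuffle step.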