Responsibilities
Imagine having the opportunity to join an organization that is committed to your professional growth and development. At Bell, you will be part of an amazing team, working with like-minded, passionate individuals in an environment that is fun, fast-paced, exciting, and ever-changing.
You will have an opportunity to:
Support data pipelines within BI GCP and BI Hadoop production environments
Support and troubleshoot technical problems within BI GCP and BI Hadoop environments
Share technical expertise with the team and provide recommendations for best practices.
Support production processes within BI teams
Support outages, delays & major deployments within the BI environment.
Qualifications
You will be a great fit on our team if...
University degree in Software/Computer Engineering or Computer Science
3+ years of industry experience working with complex ETLs and relational databases.
Hands-on experience working with GCP technologies such as Cloud Dataflow, Cloud Dataproc, Cloud Composer, and Cloud Data Fusion.
Experience with Big Data and unstructured data technologies including HDFS, Spark, MapReduce, Hive, Impala, Cloudera, NoSQL, Oozie, and Sqoop.
Advanced knowledge of and experience with SQL and relational databases such as Teradata, Oracle, and SQL Server.
Experience working with Python and Java.
Demonstrated analytical and problem-solving skills, with experience developing Big Data ingestion frameworks or working with data ingestion tools.
Able to troubleshoot code written by others, including understanding the business logic it implements and ensuring the code is optimal.
Able to prioritize, multitask, and execute tasks in a high-pressure environment.
Strong verbal and written communication skills are required.
Collaborative spirit to share learnings and knowledge with peers.
Qualifications/Requirements:
Experience with Big Data and unstructured data technologies including HDFS, Spark, MapReduce, Hive, Impala, Cloudera, NoSQL, Oozie, and Sqoop.
Familiarity with Google technologies and experience working within that framework.
A degree in Computer Science, Linguistics, or a similar program.
Excellent communication skills
Proficient in MS Applications
Able to work night shifts and a flexible schedule.
Summary of role requirements:
Looking for candidates available to work on weekdays
2–3 years of relevant work experience required for this role
Working rights required for this role
Expected start date for role: 13 September 2023
NeksJob Philippines
₱ 15,000.00 monthly · Batanes, Cagayan Valley · 14 September (updated)