Remote Big Data Jobs

Popular skills: JavaScript, Java, Python, AWS
19 jobs
Contract Remote
United States
$50 - $75 per hour 21 days ago
It's a long-term / ongoing project. Requirements: The candidate will work closely with development teams to plan, deploy, and support multiple applications across our working environments:
- Demonstrate extensive abilities and a proven record of success automating workflows and identifying process improvements
- Demonstrate a desire to keep learning, maintain your own skill set, stay up to date, and expand your knowledge across the full stack
- Contribute to thought leadership through participation in the development of technology processes
- Be comfortable learning and implementing new technologies and providing documentation where needed.
The candidate should have at least 3+ years of experience working on cloud platforms (as a DevOps engineer or software engineer), as well as experience in the following areas:
* Microsoft Certified Azure Solutions Architect Expert (preferred), Google Cloud Platform Qualified Developer, or Amazon Web Services certifications (Solutions Architect or Developer)
* Experience with Git, code management, branching strategies
* Prior development experience preferred, ability to write and understand shell/bash scripts
* Creating and working with Dockerfiles and container images
* Monitoring/operations tools (Datadog, Application Insights, Splunk, ELK stack)
* Instrumentation, profiling and performance tools (OpenTelemetry, JMeter, New Relic, AppDynamics)
* Linux administration, including network and software application configuration
* Configuration management tools (Terraform, Chef, Puppet, Ansible)
* Experience with message queues (RabbitMQ, Kafka)
* Experience with cloud storage solutions (blob storage, file shares)
* Ability to identify and configure the optimal data storage solution for a given situation
* Documenting systems and networks, refining requirements, self-identifying solutions and communicating them to the team
Networking:
* Experience managing and planning network architecture – v-nets, subnets, peering, VPNs
* Experience debugging networking issues (within a cloud environment as well as within a Kubernetes cluster)
* Experience configuring firewalls (on cloud platforms and on Ubuntu servers)
* NGINX configuration, TLS termination, working with certificates for TLS/SSL
Kubernetes:
* Creating, managing, and monitoring a Kubernetes cluster
* Helm deployments, ConfigMaps, and stateful sets
* Managing resource requests and limits (see the sketch below)
* Experience scaling a cluster and knowledge of techniques to manage load on the cluster
Deployment pipelines:
* Experience building and maintaining CI/CD pipelines (GitHub Actions knowledge is preferable)
IAM:
* Experience creating and managing roles and permissions for users, groups, and application identities (service principals in Azure)
Bonus (nice to have):
* AI/Machine Learning experience (TensorFlow)
* Neo4j – graph database
* Big Data, data analysis experience
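Illustration only (not part of the listing): a minimal sketch of one task mentioned above, inspecting container resource requests and limits with the official Kubernetes Python client; the kubeconfig and the "default" namespace are assumptions.

    # Minimal sketch: list resource requests/limits per container with the
    # official Kubernetes Python client (pip install kubernetes).
    # Assumes a local kubeconfig; the "default" namespace is just an example.
    from kubernetes import client, config

    config.load_kube_config()  # or config.load_incluster_config() inside a pod
    v1 = client.CoreV1Api()

    for pod in v1.list_namespaced_pod("default").items:
        for c in pod.spec.containers:
            res = c.resources
            print(pod.metadata.name, c.name,
                  "requests:", res.requests, "limits:", res.limits)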
Full Time Remote
United States
28 days ago
It's fun to work in a company where people truly BELIEVE in what they're doing! We're committed to bringing passion and customer focus to the business.
Position Summary: We are seeking an experienced Data Enterprise Architect to join our Product Engineering team to develop a vision, build the strategy and set an implementation roadmap for data engineering enterprise-wide. The ideal candidate will be a thought leader who knows how to balance short-term and long-term priorities against the data and infrastructure needs for actionable insights across NAVEX Global. This position sits within our architecture team and requires mastery of cross-functional collaboration, developing deep relationships with key partners across the company. You will understand requirements, evaluate design alternatives and architect complex solutions by coordinating with a large team of interdisciplinary engineers from multiple product teams with varied data requirements.
We Offer You:
- A collaborative, thriving organization with constant growth potential and strong brand recognition
- An inclusive work environment centered around personal growth, career development and mentors dedicated to your success at every level
- Competitive pay and benefits that matter, including the time and flexibility for a balanced lifestyle
What You Will Do:
- Design and document enterprise-wide data architectures, roadmaps and frameworks
- Work with data engineers to perform source system analysis, identify key data issues, perform data profiling, and develop normalized and star/snowflake physical schemas (see the sketch below)
- Orchestrate data ingestion into data warehouses, data lakes, and/or data lakehouses
- Design and implement data pipelines for processing and aggregating data
- Provision resources, monitor and adjust pipelines, and perform testing and quality control
- Work with business users, gathering business requirements and mapping them to data representations
- Design secure data infrastructure and processes, covering data security, penetration testing, Separation of Duties (SoD), and security controls and audits
- Develop strategies for data acquisition, archive recovery and database implementation
- Actively manage the evaluation, selection and application of data architecture components to create an integrated portfolio of applications which deliver maximum business value in line with enterprise strategy and standards
- Support strategic approaches and maintain alignment of development projects with the enterprise strategic direction
- Investigate, analyze and make recommendations to management regarding technology improvements, upgrades and modifications
- Apply design alternatives and concepts: layered architectures, components, interfaces, messaging and patterns for enterprise solutions
What You Will Need:
- A Bachelor's degree in Computer Science, a similar technical discipline or equivalent experience
- 8+ years of data engineering experience and 5+ years' experience architecting, developing, deploying and supporting complex data applications in an enterprise environment
- Expert knowledge of data management systems, practices and standards
- Expert analytical and design skills, including the ability to abstract information requirements from real-world processes to understand information flows in computer systems
- Expertise in the fields of data quality, data profiling, data security, Master Data concepts and data migration
- Experience in data modeling, schema design and their tradeoffs, with the ability to prepare data models at varying levels of detail, such as broad conceptual and planning models as well as detailed logical designs
- Knowledge of and experience with SQL/NoSQL databases and MySQL
- Experience bridging multiple data repositories, performance and query tuning
- Experience in designing, building and maintaining data processing infrastructure and flexible data representations
- Experience in designing distributed systems
- Experience with data cleansing, data enrichment, batch and streaming, data transformation and connecting to new data sources
- Knowledge of message-oriented architecture, Apache Spark on Hadoop, Kafka, Pulsar or other data and event streaming technologies
- Excellent verbal and written communication skills and a commitment to engage and collaborate with people across a variety of levels with diverse backgrounds
NAVEX Global is an equal opportunity employer, including disability/vets. If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us!
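Illustration only (not from the listing): a minimal sketch of the star-schema modeling and aggregation work described above, using Python's built-in sqlite3 with hypothetical dim_customer and fact_sales tables.

    # Minimal star-schema sketch: one dimension, one fact table, one rollup query.
    # Table and column names are hypothetical, chosen only for illustration.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE dim_customer (customer_id INTEGER PRIMARY KEY, region TEXT);
        CREATE TABLE fact_sales (
            sale_id     INTEGER PRIMARY KEY,
            customer_id INTEGER REFERENCES dim_customer(customer_id),
            amount      REAL
        );
        INSERT INTO dim_customer VALUES (1, 'EMEA'), (2, 'NA');
        INSERT INTO fact_sales VALUES (10, 1, 120.0), (11, 2, 80.0), (12, 1, 40.0);
    """)

    # Aggregate facts by a dimension attribute (revenue per region).
    for region, revenue in conn.execute("""
            SELECT d.region, SUM(f.amount)
            FROM fact_sales f JOIN dim_customer d USING (customer_id)
            GROUP BY d.region"""):
        print(region, revenue)

The same fact/dimension split scales up to warehouse engines; the rollup query is the pattern, the storage engine is interchangeable.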
Contract Remote during COVID-19
Toronto, ON, Canada
$30 - $55 per hour 2 months ago
Minimum 3+ years of working experience in testing. Must have a valid work permit for Canada (PR or Citizen). We have multiple open testing positions in Toronto:
- Test Manager
- Manual Test Analyst
- Big Data Test Analyst
- Test Lead - Automation
- Automation Tester
- Performance Test Lead
- TDM (Test Data Management)
- Test Analyst - Mobile Testing
- Test Analyst - Accessibility Testing
- Automation Tester
Please share your updated resume at abhinay.s@idctechnologies.com or call me at 437-370-3091 https://www.linkedin.com/in/abhinay-singh-03b613ba/ If someone in your network, such as friends or colleagues, is available for new positions, you could also recommend them. Thanks, Abhinay Singh - IDC Technologies INC
Contract Non-Remote
Sunnyvale, CA, United States
2 months ago
- 10+ years of experience building distributed, scalable back-end services relying heavily on distributed messaging, storage, and compute
- Expert knowledge and extensive hands-on experience with Java 11+ (including streams and reactive) and MongoDB (including very complex aggregation queries; see the sketch below)
- Hands-on experience with Spring, Spark, Docker, and Kubernetes
- Strong understanding and experience with CI/CD pipelines, unit and integration testing, containerization, monitoring and alerting, and debugging from production logs
- Strong collaboration skills, systems thinking and the ability to clearly explain complex concepts
- Kafka, Cassandra, Solr, HDFS, Scala experience is a plus
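Illustration only (not from the listing): a small example of the kind of MongoDB aggregation-pipeline query the role mentions, written with pymongo; the connection URI, database and collection names are placeholders.

    # Minimal MongoDB aggregation sketch with pymongo (pip install pymongo).
    # URI, database ("shop") and collection ("orders") are placeholder names.
    from pymongo import MongoClient

    client = MongoClient("mongodb://localhost:27017")
    orders = client["shop"]["orders"]

    # Total and average order amount per customer, largest spenders first.
    pipeline = [
        {"$match": {"status": "completed"}},
        {"$group": {
            "_id": "$customer_id",
            "total": {"$sum": "$amount"},
            "avg":   {"$avg": "$amount"},
        }},
        {"$sort": {"total": -1}},
        {"$limit": 10},
    ]
    for doc in orders.aggregate(pipeline):
        print(doc)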
Freelance Remote
Anywhere
2 months ago
We are looking for a Business Analyst who will be the vital link between our information technology capacity and our business objectives by supporting and ensuring the successful completion of analytical, building, testing and deployment tasks for our software product's features.
Responsibilities:
- Define configuration specifications and business analysis requirements
- Perform quality assurance
- Define reporting and alerting requirements
- Own and develop relationships with partners, working with them to optimize and enhance our integration
- Help design, document and maintain system processes
- Report on common sources of technical issues or questions and make recommendations to the product team
- Communicate key insights and findings to the product team
- Constantly be on the lookout for ways to improve monitoring, discover issues and deliver better value to the customer
Requirements:
- Previous experience in Business / Systems Analysis or Quality Assurance
- A degree in IT / Computer Science
- Proven experience in eliciting requirements and testing
- Experience in analyzing data to draw business-relevant conclusions and in data visualization techniques and tools
- Solid experience in writing SQL queries
- Basic knowledge in generating process documentation
- Strong written and verbal communication skills, including technical writing skills
Full Time Remote
Europe
3 months ago
Our mission at Fidel API is to unlock the full potential of payment cards. We offer a suite of financial infrastructure APIs that enable developers to build programmable experiences connected to purchases made in real time using a card. Our tools are transforming how merchants and users interact by powering real-time, event-driven engagements and best-in-class loyalty and rewards programs, and revolutionizing processes from reimbursements to expense management. Our APIs are used by start-ups through global enterprises including Google, British Airways, TopCashback, Perkbox, Royal Bank of Canada, and Blackhawk Networks, and are supported by the world's largest card networks, including Visa, Mastercard and American Express. Launched in 2018, Fidel is headquartered in London, with offices in Lisbon and New York and remote employees globally. Fidel is backed by investors including Nyca Partners, QED Investors, Citi Ventures, RBC Capital and Commerce Ventures. We're in an incredibly exciting period of growth as we continue to scale internationally and are looking for an experienced Data Engineer who wants to be part of this journey.
What you'll do:
- You'll advocate for data within the engineering and broader organisation
- You'll play an integral part in building and growing our Data team (analysts, etc.)
- You'll take raw data from different sources to create intuitive, scalable data models using the tools that you see best fit for the task
- You'll work cross-functionally with product, the rest of engineering, and other company members to understand the problems internal and external stakeholders are facing, with the goal of designing and implementing solutions to them
- As a data engineer, you'll architect and build a data platform that will help power Fidel's data insights
We want you, if you:
- Have 3+ years of professional experience as a Data Engineer
- Have proven experience working in AWS environments with data-related services (DynamoDB, Redshift, RDS, Glue, Athena)
- Have proven knowledge of Python and/or JavaScript and SQL, and familiarity with other data-oriented formats, languages and frameworks
- Are familiar with API integrations
- Have proven experience working in fast-paced environments, preferably in cross-functional teams
BENEFITS
We're committed to making Fidel a fantastic place to work and we go to great lengths to give you what you need to succeed. You'll receive:
- MacBook laptop and other setup equipment
- Flexible working - opportunity to work from home when you need to
- Health insurance
- Unlimited holidays (you manage your time)
- Annual company off-site (Europe)
- A fully stocked kitchen with unlimited snacks & refreshments
- Friday team lunch & drinks
OUR VALUES
At Fidel API, we live by our values and what we stand for, and that feeds into every decision we make. Fidel comes from the Latin word 'Fidelis', which means reliability, trust, truth and dependability. We honor those values — and our commitment to them — by naming ourselves after the ancient root word itself. Across our company, we speak 27 languages and represent 25+ different nationalities. It's our diversity of background, thinking, talents and skills that allows us to build truly global products for the developers who are driving payments innovation forward.
APPLICATION
At Fidel, we don't just accept difference - we celebrate it, we support it, and we thrive on it. We're proud to be an equal opportunity employer and we value diversity. We do not discriminate on the basis of educational attainment, race, religion, colour, national origin, gender, sexual orientation, age, marital status, veteran status or disability status - simply, we consider all qualified applicants, consistent with any legal requirements. If you have a disability or special need that requires accommodation, please let us know. To learn more about us and what life is like at Fidel, visit our blog or follow us on Twitter (@fidelhq) or Instagram (@fidelhq). If you think you'd be a great fit, apply today!
Contract Remote during COVID-19
Long Beach, CA, United States
21 days ago
Hope you are doing well. We are hiring for the following opportunity.
Role: Big Data Sr. Developer
Location: Long Beach, CA, USA (CLT Long Beach, CA 90801, United States)
Roles & Responsibilities:
- Strong knowledge of the healthcare payer domain
- Strong knowledge of Big Data technologies: Apache Spark, Scala, Hive, Impala, HUE, Hadoop and Cloudera experience (see the sketch below)
- Experience in Agile projects would be an added advantage
- Experience in GitHub / DevOps
- The associate would need to perform the role of a technical lead/senior developer offshore as well as coordinate with onsite for delivery
- Should have strong communication skills (verbal and written) and be able to interpret business requirements quickly and come up with the technical design
- A strong healthcare background will be a value add to the profile, especially on payer Medicare and Medicaid data
Certification(s) Required: Any Big Data certification is preferable
Must-Have Skills: Apache Spark, Scala
Good-To-Have Skills: Spark, SparkSQL, Hive
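Illustration only (not from the listing, and in PySpark rather than the Scala the role emphasizes): a minimal Spark SQL sketch over a hypothetical claims dataset, similar in spirit to a Hive/Impala query.

    # Minimal Spark SQL sketch (pip install pyspark). The "claims.csv" path and
    # column names are hypothetical, used only to illustrate the Hive-style workflow.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("claims-demo").getOrCreate()

    claims = spark.read.csv("claims.csv", header=True, inferSchema=True)
    claims.createOrReplaceTempView("claims")

    # Aggregate paid amounts per plan, the kind of query you might otherwise run in Hive/Impala.
    summary = spark.sql("""
        SELECT plan_id, COUNT(*) AS n_claims, SUM(paid_amount) AS total_paid
        FROM claims
        GROUP BY plan_id
        ORDER BY total_paid DESC
    """)
    summary.show()
    spark.stop()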
Contract Remote
Anywhere
27 days ago
The client is a small, fast-growing, and remote-first company, so you'll likely get experience on many different projects across the organization. That said, here are some things you'll focus on:
● Work on CI/CD.
● Help scale a fast-growing and unique system.
● Monitor and improve infrastructure for the Client and our customers.
● Create automation in the platform.
● Improve our developer platform - directly impact the way developers integrate the end client into their IT environments.
● This is a startup, so engineering innovations can change
Freelance Remote
Vietnam
2 months ago
A. Data Architecture
- Deliver functionality required for business and data analysts, data scientists, and other business roles to advance the overall analytic performance and strategy of the bank
- Build best practices and strategies for data infrastructure to fulfill the data analytics and utilization needs of the business, using the latest emerging technologies and capabilities
- Guide the team or teams in identifying opportunities to manage data and provide solutions for complex data feeds within the bank
- Evaluate various data architectures in the bank and utilize them to develop data solutions to meet business requirements
- Drive the delivery of data products and services into systems and business processes in compliance with internal regulatory requirements. Oversee the review of internal and external business and product requirements for data operations and activity, and suggest changes and upgrades to systems and storage to accommodate ongoing needs
B. Data Integration
- Strategically obtain and integrate data and information from various sources into the firm's platforms, solutions, and statistical models
- Lead discussions with Data Scientists to understand their data requirements and create reusable data assets that enable data scientists to build and deploy machine learning models faster
- Design, build and maintain optimized data pipelines and ETL solutions as business support tools, providing an analysis and real-time analytics platform for critical decision making. Ensure data assets are organized and stored efficiently so that information is high quality, reliable, flexible, and efficient
C. Project Management
- Manage project conflicts, challenges and dynamic business requirements to keep operations running at high performance
- Work with team leads to resolve people problems and project roadblocks; conduct post-mortem and root cause analyses to improve practices for maximum productivity
Required skills and experience
- Bachelor's or Master's degree in Statistics, Mathematics, Quantitative Analysis, Computer Science, Software Engineering or Information Technology
- 12 to 15 years of relevant experience developing, debugging, scripting and employing big data technologies (e.g. Hadoop, Spark, Flink, Kafka, Arrow, Tableau), database technologies (e.g. SQL, NoSQL, graph databases), and programming languages (e.g. Python, R, Scala, Java, Rust, Kotlin), with a preference towards functional/trait-oriented languages, including 5+ years in equivalent managerial roles
- Good English communication (you will work directly with the Department Head, who is a foreigner)
- Deep experience in designing and building dimensional data models, ETL processes, applied data warehouse concepts and methodologies, and optimized data pipelines; has worn the architect hat in the past or worked extensively with an architect
- Deep experience monitoring complex systems and solving data and systems issues with a consistent and algorithmic approach to resolving them
Why candidates should work here
We always strive to attract, develop, engage and retain employees with our best. We believe "people" are our most valuable resource, so we provide all eligible team members with a comprehensive set of benefits designed to protect their physical and financial health. It includes but is not limited to:
- Health plans
- Financial wellness
- Paid time off
- Flexible schedule
- Engagement activities
- Modern facilities
You will grow with Techcombank by having the opportunity to learn from top experts from across the world. Techcombank provides a rewarding remuneration structure that is commensurate with your achievements and contribution. Techcombank is the Top 2 best place to work in the banking industry, where you can experience various exciting activities throughout the year: company anniversary, team building, Active Saturday Off, Year-End Party, etc.
Freelance Remote
Indonesia
2 months ago
Job overview & responsibilities
Building a truly data-driven company by applying your skills and knowledge to cracking tough data challenges. You will be actively driving this vision by:
● Collect, transform and model data from various sources
● Build predictive models to help us figure out what our users need
● Design and develop machine learning applications to improve our products and boost user growth
● Guide experimentation to test new product features and growth initiatives (see the sketch below)
Mandatory requirements
● At least a Bachelor's degree in a quantitative field (e.g. Mathematics, Statistics, Computer Science)
● 4 years of experience working in a similar role in a startup environment
● Python and SQL have no secrets for you
● Experience working in GCP (or any other cloud solution)
● You solve math puzzles for fun
● A strong analytical mindset with a problem-solving attitude
● Comfortable with being critical and speaking your mind
● You are able to translate business problems into data science solutions
● Good English communication
Why candidates should apply for this position
At BukuKas, we move fast. This is an opportunity to fast-track your career growth while tackling the exciting challenges that lie ahead. Own what you do and feel a sense of accomplishment as you get to see the impact that you made rippling across micro / small businesses. As an early mover in this space, you will be part of a team that is spearheading digitization efforts at a rapid scale for our merchants. Perhaps you'll even find a sense of purpose and meaning in serving and providing a platform for these wonderful merchants - the backbone of Indonesia's economy, yet largely underserved and unbanked. Get onboard. Let's work together and make a difference in the lives of these merchants!
- Medical insurance, BPJS Ketenagakerjaan, BPJS Kesehatan, festive allowance (THR)
- Device provided
- 20 annual leave days (after passing probation)
- Probation: 3 months
- Flexible working hours: Mon to Fri, 9 AM - 6 PM
Reports to: VP of BI (from the Netherlands)
Interview process: Interview with HR > Online assessment > Hiring Manager > Peer Interview
Notes for the referrer
- Reason for rejection: candidates usually do not have end-to-end experience. This person will gather the data, create the metrics, run A/B testing, then come up with the solution and validation > an end-to-end process
- Good English communication
- Prefer candidates who are based in Indonesia. Open to recruiting expats (from India, Vietnam, SEA in general) who are based outside Indonesia, as long as they're willing to relocate to Indonesia after Covid
- From a startup environment > same speed as us, able to build things from scratch
- Prefer candidates from Grab, Gojek, Tokopedia, and tech companies in general
- Assignment: you might be given a short assignment to work on, which shouldn't take more than 2 days to complete
About BukuKas
- Founded in 2019 • Headquartered in Jakarta and Bangalore
- Total members: 180
- BukuKas's tech stack: Ruby on Rails, Node.js, React Native, Next.js, Docker, Kubernetes, Google Cloud
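Illustration only (not from the listing): a minimal sketch of the experimentation / A/B-testing workflow mentioned above, using a two-proportion z-test from statsmodels on made-up conversion counts.

    # Minimal A/B-test sketch (pip install statsmodels). The conversion counts
    # below are made-up numbers, used only to illustrate the workflow.
    from statsmodels.stats.proportion import proportions_ztest

    conversions = [310, 352]   # variant A, variant B
    visitors    = [5000, 5000]

    z_stat, p_value = proportions_ztest(count=conversions, nobs=visitors)
    print(f"z = {z_stat:.2f}, p = {p_value:.3f}")
    if p_value < 0.05:
        print("Difference is statistically significant at the 5% level.")
    else:
        print("No significant difference detected; keep collecting data or stop the test.")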
Freelance Remote
Indonesia
2 months ago
Building a truly data-driven company by applying your skills and knowledge to cracking tough data challenges. You will be actively driving this vision by:
● Collect, transform and model data from various sources
● Build predictive models to help us figure out what our users need
● Design and develop machine learning applications to improve our products and boost user growth
● Guide experimentation to test new product features and growth initiatives
Mandatory requirements
● At least a Bachelor's degree in a quantitative field (e.g. Mathematics, Statistics, Computer Science)
● 4 years of experience working in a similar role in a startup environment
● Python and SQL have no secrets for you
● Experience working in GCP (or any other cloud solution)
● You solve math puzzles for fun
● A strong analytical mindset with a problem-solving attitude
● Comfortable with being critical and speaking your mind
● You are able to translate business problems into data science solutions
● Good English communication
Why candidates should apply for this position
At BukuKas, we move fast. This is an opportunity to fast-track your career growth while tackling the exciting challenges that lie ahead. Own what you do and feel a sense of accomplishment as you get to see the impact that you made rippling across micro / small businesses. As an early mover in this space, you will be part of a team that is spearheading digitization efforts at a rapid scale for our merchants. Perhaps you'll even find a sense of purpose and meaning in serving and providing a platform for these wonderful merchants - the backbone of Indonesia's economy, yet largely underserved and unbanked. Get onboard. Let's work together and make a difference in the lives of these merchants!
- Medical insurance, BPJS Ketenagakerjaan, BPJS Kesehatan, festive allowance (THR)
- Device provided
- 20 annual leave days (after passing probation)
- Probation: 3 months
- Flexible working hours: Mon to Fri, 9 AM - 6 PM
Freelance Non-Remote
Ha Noi, Vietnam
2 months ago
Job overview & responsibilities
- Use modern data science tools (Python, Scala, TensorFlow, Py/Spark) to develop large-scale ML models on millions of structured and unstructured records (see the sketch at the end of this listing)
- Apply deep learning to text, network, and time-series data to personalize our customers' experience at scale
- Own the end-to-end process, from developing machine learning models to deployment and A/B testing for all of our millions of customers
- Work closely with business users in identifying business problems and use cases that can be resolved with data science techniques
- Constantly keep up with open-source tools and the research community for data science
Mandatory requirements
- Technical degree, including Sciences or Engineering
- 2+ years of hands-on experience with ML, coding in Python/PySpark, and distributed computing
- Good English communication (you will work directly with the Department Head, who is a foreigner)
- An inquisitive mind; able to research and generate ideas, and comfortable with large-scale data
- Understanding of the modern machine learning landscape and its mathematical foundations
- Common sense, business-driven, results-oriented, agile mentality, a do-it and own-it culture
- Strong communication (written and spoken) and presentation skills, at eye level with business and Product Owners
**When referring a candidate, please submit their answers to the questions below:
Please rate yourself (1-5, 1 worst, 5 best) on the following topics:
• Theory of standard ML algorithms
• Un/supervised algorithms, how they work, optimization
• Coding (Python or R)
• Data types and how to use them for solving business problems
• Probability and statistics
• Generic mathematics
• Logical thinking
Also:
• How many years of hands-on ML with big data?
• How many years of coding in Python/R (which one?)
• How many models (1, 2, 3, 4…) developed end to end, deployed and measured with clear metrics on business impact?
Preferred qualifications
- PhD or MSc in a technical field
- Open-source research projects, GitHub contributions, activity in the online research community
- Experience in people and project management (leading), business-relevant skills
- Experience in deployment of machine learning solutions and full-stack development is a plus
Why candidates should apply for this position
We always strive to attract, develop, engage and retain employees with our best. We believe "people" are our most valuable resource, so we provide all eligible team members with a comprehensive set of benefits designed to protect their physical and financial health. It includes but is not limited to:
- Health plans
- Financial wellness
- Paid time off
- Flexible schedule
- Engagement activities
- Modern facilities
You will grow with Techcombank by having the opportunity to learn from top experts from across the world. Techcombank provides a rewarding remuneration structure that is commensurate with your achievements and contribution. Techcombank is the Top 2 best place to work in the banking industry, where you can experience various exciting activities throughout the year: company anniversary, team building, Active Saturday Off, Year-End Party, etc.
Reports to: Head of Advanced Analytics and Innovation & Head of Innovation
Interview process: Round 1 - interview with the Data Scientist Lead > Round 2 - interview with the Head
Notes for the referrer
- Work location: Bà Triệu, Hà Nội
- Salary: up to 100,000,000 VND/month for local Vietnamese candidates
- Salary: up to USD 6,500/month for overseas Vietnamese candidates
- Two interview rounds, with in-depth questions on algorithms, machine learning, mathematics, Python and R:
  + Round 1: interview with the Data Scientist Lead
  + Round 2: interview with the Head of Advanced Analytics and Innovation or the Head of Innovation
- The team works and communicates in English, so candidates need to be fluent in English at work (the head of the team is a foreigner)
- This position is responsible for forecasting and recommendation problems:
  + For example, if TCB wants to sell a product, this person will analyze customer segments and predict the success rate > produce a portfolio of customers with common characteristics > for the Sales team to act on
  + For example, if TCB wants to launch a product with a revenue target of X billion, this person will recommend which customer segments to approach and use algorithms to forecast the future outcome
- This position will process Big Data (TCB is confident it has a large volume of data), working on machine learning, deep learning, etc.
- When submitting a candidate, please also include their answers to the following questions:
Please rate yourself (1-5, 1 worst, 5 best) on the following topics:
• Theory of standard ML algorithms
• Un/supervised algorithms, how they work, optimization
• Coding (Python or R)
• Data types and how to use them for solving business problems
• Probability and statistics
• Generic mathematics
• Logical thinking
Also:
• How many years of hands-on ML with big data?
• How many years of coding in Python/R (which one?)
• How many models (1, 2, 3, 4…) developed end to end, deployed, and measured with clear metrics on business impact?
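Illustration only (not from the listing): a minimal PySpark ML sketch of the kind of model-training step the role describes, fit on a tiny in-memory dataset with hypothetical column names.

    # Minimal PySpark ML sketch (pip install pyspark): assemble features and fit
    # a logistic regression on a tiny in-memory dataset. Columns are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.ml.feature import VectorAssembler
    from pyspark.ml.classification import LogisticRegression

    spark = SparkSession.builder.appName("ml-sketch").getOrCreate()

    df = spark.createDataFrame(
        [(35, 1200.0, 1), (22, 300.0, 0), (48, 2500.0, 1), (30, 150.0, 0)],
        ["age", "monthly_spend", "label"],
    )

    assembler = VectorAssembler(inputCols=["age", "monthly_spend"], outputCol="features")
    train = assembler.transform(df)

    model = LogisticRegression(featuresCol="features", labelCol="label").fit(train)
    model.transform(train).select("label", "prediction").show()
    spark.stop()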
Freelance Non-Remote
Hanoi, Vietnam
2 months ago
A. Data Architecture
- Deliver functionality required for business and data analysts, data scientists, and other business roles to advance the overall analytic performance and strategy of the bank
- Build best practices and strategies for data infrastructure to fulfill the data analytics and utilization needs of the business, using the latest emerging technologies and capabilities
- Guide the team or teams in identifying opportunities to manage data and provide solutions for complex data feeds within the bank
- Evaluate various data architectures in the bank and utilize them to develop data solutions to meet business requirements
- Drive the delivery of data products and services into systems and business processes in compliance with internal regulatory requirements. Oversee the review of internal and external business and product requirements for data operations and activity, and suggest changes and upgrades to systems and storage to accommodate ongoing needs
B. Data Integration
- Strategically obtain and integrate data and information from various sources into the firm's platforms, solutions, and statistical models
- Lead discussions with Data Scientists to understand their data requirements and create reusable data assets that enable data scientists to build and deploy machine learning models faster
- Design, build and maintain optimized data pipelines and ETL solutions as business support tools, providing an analysis and real-time analytics platform for critical decision making. Ensure data assets are organized and stored efficiently so that information is high quality, reliable, flexible, and efficient
C. Project Management
- Manage project conflicts, challenges and dynamic business requirements to keep operations running at high performance
- Work with team leads to resolve people problems and project roadblocks; conduct post-mortem and root cause analyses to improve practices for maximum productivity
Mandatory requirements
- Bachelor's or Master's degree in Statistics, Mathematics, Quantitative Analysis, Computer Science, Software Engineering or Information Technology
- 12 to 15 years of relevant experience developing, debugging, scripting and employing big data technologies (e.g. Hadoop, Spark, Flink, Kafka, Arrow, Tableau), database technologies (e.g. SQL, NoSQL, graph databases), and programming languages (e.g. Python, R, Scala, Java, Rust, Kotlin), with a preference towards functional/trait-oriented languages, including 5+ years in equivalent managerial roles
- Good English communication (you will work directly with the Department Head, who is a foreigner)
- Deep experience in designing and building dimensional data models, ETL processes, applied data warehouse concepts and methodologies, and optimized data pipelines; has worn the architect hat in the past or worked extensively with an architect
- Deep experience monitoring complex systems and solving data and systems issues with a consistent and algorithmic approach to resolving them
Why candidates should apply for this position
We always strive to attract, develop, engage and retain employees with our best. We believe "people" are our most valuable resource, so we provide all eligible team members with a comprehensive set of benefits designed to protect their physical and financial health. It includes but is not limited to:
- Health plans
- Financial wellness
- Paid time off
- Flexible schedule
- Engagement activities
- Modern facilities
You will grow with Techcombank by having the opportunity to learn from top experts from across the world. Techcombank provides a rewarding remuneration structure that is commensurate with your achievements and contribution. Techcombank is the Top 2 best place to work in the banking industry, where you can experience various exciting activities throughout the year: company anniversary, team building, Active Saturday Off, Year-End Party, etc.
Freelance Remote
Indonesia
3 months ago
(Indonesia / Full-time Remote) Building a truly data-driven company by applying your skills and knowledge to cracking tough data challenges. You will be actively driving this vision by:
● Collect, transform and model data from various sources
● Build predictive models to help us figure out what our users need
● Design and develop machine learning applications to improve our products and boost user growth
● Guide experimentation to test new product features and growth initiatives
Mandatory requirements
● At least a Bachelor's degree in a quantitative field (e.g. Mathematics, Statistics, Computer Science)
● 4 years of experience working in a similar role in a startup environment
● Python and SQL have no secrets for you
● Experience working in GCP (or any other cloud solution)
● You solve math puzzles for fun
● A strong analytical mindset with a problem-solving attitude
● Comfortable with being critical and speaking your mind
● You are able to translate business problems into data science solutions
● Good English communication
Why
At BukuKas, we move fast. This is an opportunity to fast-track your career growth while tackling the exciting challenges that lie ahead. Own what you do and feel a sense of accomplishment as you get to see the impact that you made rippling across micro / small businesses. As an early mover in this space, you will be part of a team that is spearheading digitization efforts at a rapid scale for our merchants. Perhaps you'll even find a sense of purpose and meaning in serving and providing a platform for these wonderful merchants - the backbone of Indonesia's economy, yet largely underserved and unbanked. Get onboard. Let's work together and make a difference in the lives of these merchants!
- Medical insurance, BPJS Ketenagakerjaan, BPJS Kesehatan, festive allowance (THR)
- Device provided
- 20 annual leave days (after passing probation)
- Probation: 3 months
- Flexible working hours: Mon to Fri, 9 AM - 6 PM
Reports to: VP of BI (from the Netherlands)
Interview process: Interview with HR > Online assessment > Hiring Manager > Peer Interview
Notes for the referrer
- Reason for rejection: candidates usually do not have end-to-end experience. This person will gather the data, create the metrics, run A/B testing, then come up with the solution and validation > an end-to-end process
- Good English communication
- Prefer candidates who are based in Indonesia. Open to recruiting expats (from India, Vietnam, SEA in general) who are based outside Indonesia, as long as they're willing to relocate to Indonesia after Covid
- From a startup environment > same speed as us, able to build things from scratch
- Prefer candidates from Grab, Gojek, Tokopedia, and tech companies in general
- Assignment: you might be given a short assignment to work on, which shouldn't take more than 2 days to complete
About BukuKas
- Founded in 2019 • Headquartered in Jakarta and Bangalore
- Total members: 180
- BukuKas's tech stack: Ruby on Rails, Node.js, React Native, Next.js, Docker, Kubernetes, Google Cloud
Full Time Non-Remote
Bratislava, Slovakia
$5k per month 3 months ago
Would you like to work with Big Data at one of the leading TravelTech companies in Central Europe and create software tools used by tens to hundreds of developers in their day-to-day job? Our client provides innovative TravelTech solutions for customers and businesses. The unique online search engine allows users to combine transportation from carriers that normally do not cooperate. Travel itineraries allow users to combine flights and ground transportation from over 800 carriers.
Your Responsibilities:
* Manage the Data Lake, consisting of transactional data from our bookings as well as a stream of events from our frontend.
* Write code in Python / Scala / Go to implement parts of the Data Lake, focusing on automation to the highest extent possible, which includes implementing batch pipelines to incrementally load data as well as implementing streaming pipelines (see the sketch after this listing).
* Design and implement the company Data Lake from transactional and streaming data
* Write ML algorithms and get AI to production to actually have an impact on the product
* Identify weak spots and refactor code that needs it during development
* Optimize code and usage of 3rd-party services for speed and cost effectiveness
Must-have Skills:
* 2+ years of full-time experience in a similar position
* Strong coding skills in Python or Scala
* Advanced query language (SQL) knowledge
* Hands-on experience with at least 2 of: PostgreSQL, MySQL, Redshift, ElasticSearch
* Experience with orchestration tools (ideally Airflow)
* Experience with Big Data processing engines such as Apache Spark, Apache Beam and their cloud runners Dataproc/Dataflow
* Knowledge of ML/AI algorithms like OLS and gradient descent and their application, from linear regression to deep neural networks
* Experience with batch and real-time data processing
* Cloud knowledge (GCP is the best fit; alternatively AWS or Azure)
* BS/MS in Computer Science or a related field (ideal)
Company offers:
* Quarterly bonuses
* Stock options
* 20+5 days vacation / year
* Meal vouchers, Cafeteria program, sick days, VIP Medical Care, Multisport card
* Hardware from Apple or Microsoft based on your preferences
* Permanent & full-time employment
* Flexible working schedule
* Salary €4,250 per month
Location: 150 km radius around Bratislava (Slovakia), incl. Brno (CZ), Vienna (Austria), Nitra (SK)
Why To Join This Team?
* Transforming and enhancing the business by making use of Data
* It feels like a startup within a scale-up company!
* Fast-paced & ambitiously growing company... which means a lot of data to process!
* Great team spirit and autonomy to deliver results the way you prefer.
Great Team:
* The Data Intelligence Tribe currently holds 5 teams with about 30 Data specialists... so there are lots of professionals to learn from.
* This will at least double in size within the next 12 months... so you'll have a chance to onboard new joiners.
* Teams are led by both Tech Leads and Product Managers... so you get the best from both worlds.
* Our colleagues collaborate extensively across team boundaries and tribes... which strengthens your seniority and professional growth.
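Illustration only (not from the listing): a minimal sketch of a daily incremental batch pipeline expressed as an Airflow DAG (assuming Airflow 2.4+); the DAG id, task id and load logic are placeholders.

    # Minimal Airflow 2.4+ DAG sketch: one daily task that "loads" the previous
    # day's partition. DAG id, task id and the load logic are placeholders.
    from datetime import datetime
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def load_partition(ds, **_):
        # 'ds' is the logical date (YYYY-MM-DD) Airflow passes to the task;
        # a real task would extract that day's rows and upsert them downstream.
        print(f"Loading bookings partition for {ds}")

    with DAG(
        dag_id="bookings_incremental_load",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        PythonOperator(task_id="load_daily_partition", python_callable=load_partition)

Parameterizing the task by the logical date is what makes the load incremental and safely re-runnable for any single day.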
Full Time Remote
Anywhere
$4.8k per month 3 months ago
Would you like to work with terabytes of Big Data at one of the leading TravelTech companies in Central Europe and create software tools used by tens to hundreds of developers in their day-to-day job? Our client provides innovative TravelTech solutions for customers and businesses. The unique online search engine allows users to combine transportation from carriers that normally do not cooperate. Travel itineraries allow users to combine flights and ground transportation from over 800 carriers.
Your Responsibilities:
* Develop and automate large-scale, high-performance data processing systems (batch and/or streaming) to drive business growth.
* Build scalable data platform products leveraging the Airflow scheduler/executor framework.
* Work with a modern cloud data stack on GCS, Pub/Sub, Kafka, BigQuery, Postgres, Looker (see the sketch after this listing).
* Build scalable, reliable, secure, efficient and highly performant platforms and infrastructure for a variety of analytics and business applications.
* Contribute to data pipeline tooling, implementing and extending its functionality using Python so that all users of the framework benefit from it.
* Contribute to shared data engineering tooling & standards to improve the productivity and quality of output for data engineers across the company.
* Improve data quality by using & improving internal tools to automatically detect issues.
Requirements
Must-have Skills:
* Strong coding skills in Python or Scala
* Big Data engineering in the cloud (ideally GCP)
* 5+ years of experience in software development
* Broad knowledge of different types of data storage engines (relational, non-relational)
* Hands-on experience with at least 2 of: PostgreSQL, MySQL, Redshift, ElasticSearch
* Experience with orchestration tools (ideally Airflow)
* Advanced query language (SQL) knowledge
* Cloud knowledge – Google Cloud (best fit), AWS, Azure
Location:
* 150 km radius around Bratislava (Slovakia), incl. Brno (CZ), Vienna (Austria), Nitra (SK)
Company offers:
* Quarterly bonuses
* Stock options
* 20+5 days vacation / year
* Meal vouchers, Cafeteria program, sick days, VIP Medical Care, Multisport card
* Hardware from Apple or Microsoft based on your preferences
* Permanent & full-time employment
* Flexible working schedule
* Salary €4,025.00
Why To Join This Team?
* Transforming and enhancing the business by making use of Data
* It feels like a startup within a scale-up company!
* Fast-paced & ambitiously growing company... which means a lot of data to process!
* Great team spirit and autonomy to deliver results the way you prefer.
Great team:
* The Data Intelligence Tribe currently holds 5 teams with about 30 Data specialists... so there are lots of professionals to learn from.
* This will at least double in size within the next 12 months... so you'll have a chance to onboard new joiners.
* Teams are led by both Tech Leads and Product Managers... so you get the best from both worlds.
* Our colleagues collaborate extensively across team boundaries and tribes... which strengthens your seniority and professional growth.
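Illustration only (not from the listing): a minimal sketch of loading a daily batch of files from GCS into BigQuery with the google-cloud-bigquery client; the project, bucket, dataset and table names are placeholders.

    # Minimal GCS -> BigQuery batch-load sketch (pip install google-cloud-bigquery).
    # Project, dataset, table and bucket names are placeholders.
    from google.cloud import bigquery

    client = bigquery.Client(project="my-project")

    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
        autodetect=True,
    )

    load_job = client.load_table_from_uri(
        "gs://my-bucket/events/2024-01-01/*.json",
        "my-project.analytics.events",
        job_config=job_config,
    )
    load_job.result()  # wait for the load job to finish
    print(client.get_table("my-project.analytics.events").num_rows, "rows in table")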
Freelance Remote
Singapore
4 months ago
Job overview & responsibilities
- You will participate in the development that creates the world's #1 app store analytics service
- Together with the team you will build out new product features and applications using agile methodologies and hot open-source technologies
- You will work with the Product Manager and Software Architects, be on the front lines of coding, and have the chance to be an end-to-end data expert, covering data scraping, data processing, data storage, and data analytics/mining
You will be responsible for, and take pride in:
- Implementing and maintaining the App Annie Unified Data Warehouse and Data Pipeline, which provides the capabilities below over hundreds of trillions of records and petabytes of columnar data:
  - Platform: one centralized platform across different cloud providers for all teams to share workload
  - Data: one source of truth for all App Annie businesses
  - Performance: seconds to sub-second analytics query response
  - Cost: better machine-cost and human-cost efficiency is always our pursuit
- Being confident in dealing with big data challenges with high-quality output
- Communicating well with overseas teams in both spoken and written English
- Becoming better at what you do every day
Mandatory requirements
- Deep understanding of hardware and computer organization, the Linux operating system, computer networks, compilers, etc.
- Solid skills in big-data-related data structures and algorithms
- Proficient in distributed computing and distributed storage
- Experienced in data processing such as ETL
- Experienced with the big data ecosystem (computation: MapReduce, Spark/Flink, Presto/Hive/Redshift/Snowflake, etc.; storage: PostgreSQL, Elasticsearch, HDFS, Kafka, etc.); knowledge of AWS/Google Cloud/Microsoft Azure
- Strong problem-solving, analytical and troubleshooting skills
- Good English-based communication; knowledge of Mandarin is a huge plus since you will be working closely with the Beijing office
- Energy and creativity are key characteristics that describe you and the projects you are involved in. You make it happen. Boom!
Preferred qualifications
- Proficient programming experience in SQL and Python; Scala/Java is a big plus
Why candidates should apply for this position
- We provide a WFH allowance to set you up for remote work success.
- Internet allowance for a stable internet connection, so your video does not freeze on Zoom.
- Flexible working days. We love to meet, but if you need to get your kids behind school-Zoom, or need to leave early to get to your band rehearsal or gym classes, do your thing.
- Paid leave, so long as you promise to come back!
- Health and dental benefits.
- An international team of talented and engaged people from different cultural backgrounds and locations.
- Wellbeing allowance for any activity that matters to your wellbeing: (online) gym classes, fitness equipment, mindfulness apps or even childcare support!
- Unlimited access to the online learning platform Udemy to help you develop your skills.
- Virtual initiatives and events to keep you connected with your colleagues.
Full Time Remote
Anywhere
4 months ago
Our company needs a professional, effective and results-oriented team to design, develop and support complex enterprise Big Data solutions.
Responsibilities:
- Implement new features
- Fix bugs in existing features
- Refactor existing code
- Improve test coverage
- Build and ship product artifacts to dev/prod clusters
Internship Remote
Anywhere
1 year ago
MCAT 101 is a new platform designed to engage students in the process of their prep and encourage them to learn from each other and have fun! Students often spend hundreds or even thousands of dollars during this time, when these resources could be provided for free and learned from one another. Here at MCAT 101, we are trying to create an organization that provides free tutoring services and resources for students. You do not have to be interested in the MCAT at all to help, though! If you are interested in any volunteer/internship opportunities for the fall, we have a place for you. Some opportunities include writing for our premed blog, running our account as part of the student support team, answering any science questions students may have, coding our website, or even developing our team further through business. We have a startup-style culture in a great fast-paced team and are looking for passionate people who want to make an impact. Come join us! If you are interested, please send a resume and a paragraph of interest to mcat101help@gmail.com