Redshift hadoop jobs

    2,000 redshift hadoop jobs found, prices in EUR

    Install Hadoop on the cloud and interface with web services

    €346 Average bid
    3 bids

    ...ideal freelancer for this project should have proven experience in time series forecasting, Python programming, and developing interactive dashboards. A solid grasp of data visualization principles would be a plus. The output should be in the form of a Python file with the 4 models above and related dynamic visualizations. Text preprocessing is required and can be done with any suitable technologies (Hadoop/Spark, NoSQL/SQL databases); screenshots of the bash code are required....

    €30 Average bid
    13 bids

    I'm seeking intermediate-level big data engineers who can tackle essential tasks within the d...capable of combining data from various sources to create a cohesive dataset.
    - Develop data visualization: The engineers will need to create meaningful and understandable visual representations of the analyzed data.
    The necessary skills for this role are:
    - Proficient in Python: The candidate should be well-versed in the Python programming language to work with our system.
    - Knowledge of Hadoop: Experience with Hadoop is a must for this position, to handle the data integration and data storage tasks.
    - Knowledge of Kafka
    Prior experience in big data engineering and a good understanding of data engineering principles will be vital. Please note that this project suits an intermedia...

    €14 / hr Average bid
    26 bids

    ...specifically Cassandra, BigQuery, Snowflake, and Redshift.
    Key Responsibilities:
    - Research, understand, and articulate the distinct approaches of the specified databases
    - Translate complex concepts into clear, concise, and reader-friendly articles
    Ideal Candidate Should Have:
    - Very deep expertise in databases and distributed systems.
    - Ideally, a Ph.D. or deep research-writing experience; publications in top conferences are a plus.
    - An understanding of database architectures
    - Prior experience writing technical articles for a technical audience
    - The ability to explain complex topics in an easy-to-understand manner
    - Knowledge of Cassandra, BigQuery, Snowflake, and Redshift is a big plus.
    In the scope of this ...

    €11 / hr Average bid
    27 bids

    ...seeking a 3D Artist proficient in Houdini for a photorealistic rendering project.
    Skills and Experience:
    - Sound Houdini knowledge, particularly importing USD file formats and high-quality rendering.
    - Proven ability to deliver photorealistic visualization.
    Your role:
    - Import an animated USD file into Houdini.
    - Render the animation with the Redshift renderer.
    - Provide a demonstration of the process used.
    Your interest in this project confirms you are capable and equipped to meet the requirements. I anticipate inspiring bids from proficient Houdini users....

    €163 Average bid
    3 bids

    Seeking a skilled developer to optimize and enhance the architecture of our existing web scraper application. The application is currently built using NestJS and PostgreSQL, and we are looking to scale it up and leverage cloud functionality for improved p...error handling, rate limiting, and IP rotation.
    - Strong problem-solving skills and the ability to optimize application performance.
    - Excellent communication and collaboration skills.
    Nice to have:
    - Experience with PostgreSQL and database optimization techniques.
    - Knowledge of additional programming languages like Python or Java.
    - Familiarity with data processing frameworks like Apache Spark or Hadoop.
    - Experience with data visualization and reporting tools.
    Potential for ongoing collaboration based on performance and future req...

    €508 Average bid
    76 bids

    I need someone to create the DAG and trigger it. I am working on a migration project from Hadoop to BigQuery. ...more details will be shared via chat

    €16 Average bid
    5 bids
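For context, a Hadoop-to-BigQuery DAG typically chains an HDFS-to-GCS copy (e.g. `hadoop distcp` with the GCS connector) with a `bq load` step. A minimal sketch of the two commands such a DAG might wrap — bucket, dataset, and table names here are hypothetical placeholders, and in practice each command would be an Airflow BashOperator task:

```python
# Sketch: the two shell steps a Hadoop -> BigQuery Airflow DAG commonly wraps.
# Bucket, dataset, and table names are hypothetical placeholders.

def build_migration_commands(hdfs_path, gcs_bucket, dataset, table):
    """Return the ordered shell commands for a simple HDFS -> GCS -> BigQuery load."""
    # Step 1: copy the HDFS data into a GCS staging area.
    copy_cmd = f"hadoop distcp {hdfs_path} gs://{gcs_bucket}/staging/"
    # Step 2: load the staged files into a BigQuery table.
    load_cmd = (
        f"bq load --source_format=AVRO {dataset}.{table} "
        f"gs://{gcs_bucket}/staging/*"
    )
    return [copy_cmd, load_cmd]

for cmd in build_migration_commands(
    "hdfs:///warehouse/events", "my-staging-bucket", "analytics", "events"
):
    print(cmd)
```

In an actual DAG the second task would depend on the first, so the load only runs once the copy succeeds.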

    I am seeking a skilled data engineering trainer; fluency in Hadoop, Apache Spark, and SQL is paramount. Your expertise will guide me through nuanced uses of these technologies, with a particular focus on data migration.
    Key Requirements:
    - Proficiency in Hadoop, Apache Spark, and SQL
    - More than 10 hours of availability weekly
    - Proven experience in real-world data migration projects
    Ideal candidates should have a flair for explaining complex concepts in simple language. This engagement will focus on moving data from diverse sources into a data warehouse, thereby making it readily available for business intelligence functions.

    €7 / hr Average bid
    6 bids

    I'm in need of a proficient professional versed in Java Hadoop cluster. Please place your bids immediately. $20 for this project

    €3 / hr Average bid
    3 bids

    I am in urgent need of a Hadoop/Spark developer proficient in both Scala and Python for a data processing task. I have a huge volume of unstructured data that needs to be processed and analyzed swiftly and accurately.
    Key Project Responsibilities:
    - Scrubbing and cleaning the unstructured data to detect and correct errors.
    - Designing algorithms in Scala and Python to process data in Hadoop/Spark.
    - Ensuring effective data processing and overall system performance.
    The perfect fit for this role is a professional who has:
    - Expertise in the Hadoop and Spark frameworks.
    - Proven experience in processing unstructured data.
    - Proficient coding skills in both Scala and Python.
    - A deep understanding of data structures and algorithms.
    - Familiarity with data ...

    €23 / hr Average bid
    39 bids

    I'm a beginner in AWS and would like hands-on training on specific services. My goals for this training are to:
    - Gain a thorough understanding of the CloudFormation service
    - Learn best practices and use cases of the RDS and S3 services
    - Explore the data handling capabilities of Glue and Redshift
    - Understand the workings of EC2 and its significance
    - Get insights on the SQS/SNS services and their usage
    I'm looking for an AWS expert who has experience with these services and can provide me with clear explanations and practical examples. A structured approach to learning, with emphasis on real-world applications, would be greatly beneficial. Please share your relevant experience and teaching methodologies in your proposal.

    €11 / hr Average bid
    19 bids

    ...and natural language processing
    3. Strong proficiency in programming languages such as Python, Java, and C++, as well as web development frameworks like Node.js and React
    4. Experience with cloud computing platforms such as AWS, Azure, or Google Cloud, and containerization technologies like Docker and Kubernetes
    5. Familiarity with data engineering and analytics tools and techniques, such as Hadoop, Spark, and SQL
    6. Excellent problem-solving and analytical skills, with the ability to break down complex technical challenges into manageable components and solutions
    7. Strong project management and communication skills, with the ability to collaborate effectively with both technical and non-technical stakeholders
    8. Familiarity with agile development methodologies and best pr...

    €1495 Average bid
    NDA
    97 bids

    We are looking for an Informatica BDM developer with 7+ yrs of experience, who can support us for 8 hours a day, Monday - Friday.
    Title: Informatica BDM Developer
    Experience: 5+ Yrs
    100% Remote
    Contract: Long term
    Timings: 10:30 am - 07:30 pm IST
    Required Skills: Informatica Data Engineering, DIS and MAS
    • Databricks, Hadoop
    • Relational SQL and NoSQL databases, including some of the following: Azure Synapse/SQL DW and SQL Database, SQL Server and Oracle
    • Core cloud services from at least one of the major providers in the market (Azure, AWS, Google)
    • Agile Methodologies, such as SCRUM
    • Task tracking tools, such as TFS and JIRA

    €1207 Average bid
    5 bids

    ...schedules. Timings: 10:00 am - 07:00 pm IST, Remote (India)
    - Collaborate with business analysts to understand and gather requirements for existing or new ETL pipelines.
    - Connect with stakeholders daily to discuss project progress and updates.
    - Work within an Agile process to deliver projects in a timely and efficient manner.
    - Have worked extensively on Redshift and understand performance tuning techniques and the management of Redshift data workloads.
    - Design and develop Airflow DAGs to schedule and manage ETL workflows.
    - Implement best practices for data engineering, including data modeling, data warehousing, and data pipeline architecture.
    - Monitor and troubleshoot ETL pipelines to ensure smooth operation.
    - Custom Python functions to handle data quality and validat...

    €1375 Average bid
    22 bids

    As the owner of a large SQL Server database, I'm looking for the expertise to convert my SQL Server code to Redshift SQL. The database in question is greater than 10GB and primarily contains valuable customer data. The ideal freelancer for this project should:
    - Have a strong understanding of SQL and Redshift
    - Have experience converting stored procedures, views, functions, etc. from SQL Server to Redshift
    - Be familiar with transforming and restructuring customer data
    Your job will be to ensure the correct and efficient migration of all customer data without loss of any rows. Be prepared to handle the larger data volume of this SQL Server database.

    €17 / hr Average bid
    10 bids
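To give a flavor of what such a conversion involves: several T-SQL constructs have direct Redshift equivalents — `SELECT TOP n` becomes `LIMIT n`, `ISNULL` becomes `NVL` (or `COALESCE`), and bracket-quoted identifiers become double-quoted. A toy regex-based sketch of three such rewrites (illustrative only; a real migration needs a proper SQL parser and handles far more cases, especially stored procedures):

```python
import re

# Toy translation of a few common T-SQL idioms to Redshift SQL.
# Illustrative only -- a real migration needs a full SQL parser.
def tsql_to_redshift(sql):
    # SELECT TOP n ...  ->  SELECT ... LIMIT n
    m = re.match(r"(?is)^\s*SELECT\s+TOP\s+(\d+)\s+(.*)$", sql)
    if m:
        body = m.group(2).rstrip().rstrip(";").rstrip()
        sql = f"SELECT {body} LIMIT {m.group(1)};"
    # ISNULL(a, b) -> NVL(a, b)
    sql = re.sub(r"(?i)\bISNULL\s*\(", "NVL(", sql)
    # [identifier] -> "identifier"
    sql = re.sub(r"\[([^\]]+)\]", r'"\1"', sql)
    return sql

print(tsql_to_redshift("SELECT TOP 10 [name], ISNULL(city, 'n/a') FROM [customers];"))
```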

    ...which will include parameters such as patient age ranges, geographical regions, social conditions, and specific types of cardiovascular diseases.
    Key responsibilities:
    - Process distributed data using Hadoop/MapReduce or Apache Spark
    - Develop an RNN model (preferably in Python)
    - Analyze the complex CSV data (5,000+ records)
    - Identify and predict future trends based on age, region, type of disease, and other factors
    - Properly visualize results in digestible diagrams
    Ideal candidates should have:
    - Experience in data analysis with Python
    - A solid understanding of Hadoop/MapReduce or Apache Spark
    - Proven ability working with Recurrent Neural Networks
    - Excellent visualization skills to represent complex data in static or dynamic dashboards
    - Experien...

    €455 Average bid
    84 bids

    I am looking for an experienced Senior Data Engineer for interview training. Your primary responsibilities would be data cleaning and preprocessing, database design and optimization, and implementing ETL processes.
    Key responsibilities include:
    - Clean and preprocess data to ensure its quality and efficiency.
    - Design and optimize databases, aiming for both flexibility and speed.
    - Implement ETL (Extract, Transform, Load) processes to facilitate the effective and secure movement of data.
    Skills and Experience:
    - Proficient in Python, SQL, and Hadoop.
    - Expertise in handling medium-sized databases (1GB-1TB).
    - Proven track record in handling ETL processes.
    Your expertise in these areas will be crucial to the successful completion of thi...

    €49 / hr Average bid
    17 bids

    I have encountered a problem with my Hadoop project and need assistance. My system is showing "HADOOP_HOME and hadoop.home.dir are unset", and I am not certain whether I've set the HADOOP_HOME and hadoop.home.dir variables correctly. This happens when creating a pipeline release in DevOps. For this project, I am looking for someone who:
    - Has extensive knowledge of Hadoop and its environment variables
    - Can determine whether I have set the HADOOP_HOME and hadoop.home.dir variables correctly and resolve any related issues
    - Is able to figure out the version of Hadoop installed on my system and solve compatibility issues, if any
    I will pay for the solution immediately.

    €20 / hr Average bid
    15 bids
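For reference, this error comes from Hadoop's home-directory lookup, which fails when neither setting points at a Hadoop installation. A stdlib sketch of the kind of preflight check a pipeline step can run before invoking Hadoop — note that `hadoop.home.dir` is really a JVM system property rather than an environment variable, and is treated as a plain dictionary entry here only for illustration:

```python
import os

# Preflight check: report which Hadoop home settings are missing or suspect.
# Simplified: hadoop.home.dir is actually a JVM system property, and a real
# check would also verify bin/hadoop (or winutils.exe on Windows) exists.
def check_hadoop_env(env):
    problems = []
    home = env.get("HADOOP_HOME") or env.get("hadoop.home.dir")
    if not home:
        problems.append("HADOOP_HOME and hadoop.home.dir are unset")
    elif not os.path.isdir(home):
        problems.append(f"configured Hadoop home does not exist: {home}")
    return problems

# An empty environment reproduces the reported error.
print(check_hadoop_env({}))  # -> ['HADOOP_HOME and hadoop.home.dir are unset']
```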

    *Title: Freelance Data Engineer*
    *Description:* We are seeking a talented freelance data engineer to join our team on a project basis. The ideal candidate will have a strong background in data engineering, with expertise in designing, implementing, and maintaining data pipelines and infrastructure. You will work closely with our data scientists and analysts to ensure the smooth flow of data from various sources to our data warehouse, and to support the development of analytics and machine learning solutions. This is a remote position with flexible hours.
    *Responsibilities:*
    - Design, build, and maintain scalable and efficient data pipelines to collect, process, and store large volumes of data from diverse sources.
    - Collaborate with data scientists and analysts to understand data require...

    €77 Average bid
    3 bids

    As a beginner, I am seeking a knowledgeable developer who can guide me on effectively using Google Cloud for Hadoop, Spark, Hive, Pig, and MR. The main goal is data processing and analysis.
    Key Knowledge Areas Needed:
    - Google Cloud usage for big data management
    - Relevant functionalities of Hadoop, Spark, Hive, Pig, and MR
    - Best practices for data storage, retrieval, and workflow streamlining
    Ideal Skills:
    - Extensive Google Cloud experience
    - Proficiency in Hadoop, Spark, Hive, Pig, and MR for data processing
    - Strong teaching abilities for beginners
    - Demonstrated experience in data processing and analysis.

    €156 Average bid
    14 bids
    AWS Practical help (Ended)

    Explain the concepts:
    - Amazon RDS
    - Amazon Aurora
    - Amazon DynamoDB
    - Amazon Neptune
    - Amazon MemoryDB for Redis
    - DocumentDB
    - Amazon QLDB
    - Athena Overview
    - Redshift Overview
    - EMR Overview

    €6 / hr Average bid
    7 bids

    As a beginner, I am seeking a knowledgeable developer who can guide me on effectively using Google Cloud for Hadoop, Spark, Hive, Pig, and MR. The main goal is data processing and analysis.
    Key Knowledge Areas Needed:
    - Google Cloud usage for big data management
    - Relevant functionalities of Hadoop, Spark, Hive, Pig, and MR
    - Best practices for data storage, retrieval, and workflow streamlining
    Ideal Skills:
    - Extensive Google Cloud experience
    - Proficiency in Hadoop, Spark, Hive, Pig, and MR for data processing
    - Strong teaching abilities for beginners
    - Demonstrated experience in data processing and analysis.

    €168 Average bid
    26 bids

    Budget: $5 I...png. I am also attaching some logos below, but feel free to browse the internet and find the png logos. All the logos provided should be standard-sized, with a transparent background, in png format, with one color and one grayscale export. I need the following logos:
    - SAP
    - SAP Business Objects
    - SAP Analytics Cloud
    - Microsoft
    - Microsoft Power BI
    - Microsoft Power Apps
    - Microsoft Azure
    - Microsoft Fabric
    - AWS
    - AWS Redshift
    - Google
    - Google Cloud Platform
    - Google Big Query
    - Tableau
    - Qlik
    - Salesforce
    - Zendesk
    For your application to be considered, please:
    - Be experienced in creating and editing logos
    - Include at least one example of a logo so I can see you can edit this properly.
    Attention to detail and a quick turnaround time are critical for this project. I look forward to your speedy ...

    €5 / hr Average bid
    32 bids

    As a beginner, I am seeking a knowledgeable developer who can guide me on effectively using Google Cloud for Hadoop, Spark, Hive, Pig, and MR. The main goal is data processing and analysis.
    Key Knowledge Areas Needed:
    - Google Cloud usage for big data management
    - Relevant functionalities of Hadoop, Spark, Hive, Pig, and MR
    - Best practices for data storage, retrieval, and workflow streamlining
    Ideal Skills:
    - Extensive Google Cloud experience
    - Proficiency in Hadoop, Spark, Hive, Pig, and MR for data processing
    - Strong teaching abilities for beginners
    - Demonstrated experience in data processing and analysis.

    €19 Average bid
    11 bids

    ...commonly used packages, especially with GCP. Hands-on experience with data migration and data processing on the Google Cloud stack, specifically:
    - BigQuery
    - Cloud Dataflow
    - Cloud DataProc
    - Cloud Storage
    - Cloud DataPrep
    - Cloud PubSub
    - Cloud Composer & Airflow
    Experience designing and deploying large-scale distributed data processing systems with a few technologies such as PostgreSQL or equivalent databases, SQL, Hadoop, Spark, Tableau. Hands-on experience with Python JSON nested data operations. Exposure to or knowledge of API design and REST, including versioning, isolation, and micro-services. Proven ability to define and build architecturally sound solution designs. Demonstrated ability to rapidly build relationships with key stakeholders. Experience of automated unit testing, automated integra...

    €12 / hr Average bid
    11 bids

    I am looking for a skilled professional who can efficiently set up a big data cluster.
    REQUIREMENTS:
    • Proficiency in Elasticsearch, Hadoop, Spark, and Cassandra
    • Experience working with large-scale data storage (10+ terabytes).
    • Able to structure data effectively.
    SPECIFIC TASKS INCLUDE:
    - Setting up the Elasticsearch/Hadoop/Spark/Cassandra big data cluster.
    - Ensuring the stored data is structured.
    - Preparing for the ability to handle more than 10 terabytes of data.
    The ideal candidate will have substantial experience with large data structures and a deep understanding of big data database technology. I encourage experts in big data management, well-versed in big data best practices, to bid for this project.

    €28 / hr Average bid
    3 bids

    ...tasks that need to be performed to ensure that our data structures run smoothly and effectively. Specifically:
    - One of your main responsibilities would be to add 5 new calculated columns to our Redshift table. To do this, you will need to update existing Glue jobs that are written in Python, and test this in DEV, UAT, and Production.
    - Furthermore, you will need to create or update views and stored procedures in Redshift to create a Tableau extract.
    The core competences candidates need to possess are:
    - Extensive experience with the AWS data stack, including Glue (used with Python), S3, Redshift, etc.
    - A strong understanding of data structures and databases.
    - Solid knowledge of Python.
    - A proven track record on similar projects would be adva...

    €918 Average bid
    18 bids

    I'm looking to have a SQL view query written in Redshift. With no specific tables indicated, the freelancer should be adept enough to identify the relevant tables during the development process.
    Requirement:
    - Expertise in SQL and Amazon Redshift.
    What To Include In Your Proposal:
    - Any past work related to SQL queries, especially if you've previously worked with Redshift.
    Deliverable:
    - The SQL view query will be used for reporting, so it needs to be efficient and reliable.

    €71 Average bid
    6 bids

    We are looking for an Informatica BDM developer with 7+ yrs of experience, who can support us for 8 hours a day, Monday - Friday.
    Title: Informatica BDM Developer
    Experience: 5+ Yrs
    100% Remote
    Contract: Long term
    Timings: 10:30 am - 07:30 pm IST
    Required Skills: Informatica Data Engineering, DIS and MAS
    • Databricks, Hadoop
    • Relational SQL and NoSQL databases, including some of the following: Azure Synapse/SQL DW and SQL Database, SQL Server and Oracle
    • Core cloud services from at least one of the major providers in the market (Azure, AWS, Google)
    • Agile Methodologies, such as SCRUM
    • Task tracking tools, such as TFS and JIRA

    €1130 Average bid
    3 bids

    I am seeking a skilled professional proficient in managing big data tasks with Hadoop, Hive, and PySpark. The primary aim of this project is processing and analyzing structured data.
    Key Tasks:
    - Implement Hadoop, Hive, and PySpark to analyze large volumes of structured data.
    - Use Hive and PySpark for sophisticated data analysis and processing techniques.
    Ideal Skills:
    - Proficiency in the Hadoop ecosystem
    - Experience with Hive and PySpark
    - Strong background in working with structured data
    - Expertise in big data processing and data analysis
    - Excellent problem-solving and communication skills
    Deliverables:
    - Converting raw data into useful information using Hive and visualizing query results as graphical representations.
    - C...

    €16 / hr Average bid
    15 bids
    Data Engineer (Ended)

    ...custom scripts as needed.
    ● Ensure the efficient extraction, transformation, and loading of data from diverse sources into our data warehouse.
    Data Warehousing:
    ● Design and maintain data warehouse solutions on AWS, with a focus on scalability, performance, and reliability.
    ● Implement and optimize data models for efficient storage and retrieval in AWS Redshift.
    AWS Service Utilization:
    ● Leverage AWS services such as S3, Lambda, Glue, Redshift, and others to build end-to-end data solutions.
    ● Stay abreast of AWS developments and recommend the adoption of new services to enhance our data architecture.
    SQL Expertise:
    ● Craft complex SQL queries to support data analysis, reporting, and business intelligence requirements.
    ● Optimize SQL code for performance and efficiency, ...

    €2164 Average bid
    8 bids

    ...R), and other BI essentials, join us for global projects.
    What We're Looking For: Business Intelligence Experts with Training Skills:
    - Data analysis, visualization, and SQL
    - Programming (Python, R)
    - Business acumen and problem-solving
    - Effective communication and domain expertise
    - Data warehousing and modeling
    - ETL processes and OLAP
    - Statistical analysis and machine learning
    - Big data technologies (Hadoop, Spark)
    - Agile methodologies and data-driven decision-making
    - Cloud technologies (AWS, Azure) and data security
    - NoSQL databases and web scraping
    - Natural Language Processing (NLP) and sentiment analysis
    - API integration and data architecture
    Why Work With Us:
    - Global Opportunities: Collaborate worldwide across diverse industries.
    - Impactful Work: Empower businesses through data-drive...

    €19 / hr Average bid
    24 bids
    AWS Expert (Ended)

    ...understanding of cloud and cloud-native principles and practices
    · Hands-on experience with cloud environments: AWS, GCP, Azure
    · AWS Services: EC2, EC2 Container Service, Lambda, Elastic Beanstalk, S3, EFS, Storage Gateway, Glacier, VPC, Direct Connect, Transit Gateway, ELB, Auto Scaling, ACM, CloudFront, CloudFormation, CloudWatch, CloudTrail, SNS, SES, SQS, SWF, IAM, RDS, DynamoDB, ElastiCache, Redshift, AWS Backup
    · Operating Systems: UNIX, Red Hat LINUX, Windows
    · Networking & Protocols: TCP/IP, Telnet, HTTP, HTTPS, FTP, SNMP, LDAP, DNS, DHCP, ARP, SSL, IDM 6.0 and 7.0
    · DevOps Tools: Puppet, Chef, Subversion (SVN), GIT, Jenkins, Hudson, Ansible, Docker and Kubernetes
    · Scripting Languages: UNIX Shell Scripting (Bour...

    €7 / hr Average bid
    1 bid

    I'm launching an extensive project that needs a proficient expert in Google Cloud Platform (including BigQuery, GCS, and Airflow/Composer), Hadoop, Java, Python, and Splunk. The selected candidate should display exemplary skills in these tools and offer long-term support.
    Key Responsibilities:
    - Data analysis and reporting
    - Application development
    - Log monitoring and analysis
    Skills Requirements:
    - Google Cloud Platform (BigQuery, GCS, Airflow/Composer)
    - Hadoop
    - Java
    - Python
    - Splunk
    The data size is unknown at the moment, but proficiency in managing large datasets will be advantageous. Please place your bid taking all these factors into account. Prior experience handling similar projects will be a plus. I look forward to working with a dedicated and know...

    €453 Average bid
    53 bids

    I need an experienced AWS Data Engineer to develop a robust data warehouse, moving data from SQL Server to S3 using AWS DMS, and then from S3 to Redshift. The data, currently in CSV files, will need to be efficiently imported into AWS. The ultimate aim is to enable seamless analytics and reporting from the data warehouse.
    Ideal Skills and Experience:
    - Proficiency in AWS data warehousing
    - Strong experience with CSV files
    - Prior experience with sales data highly preferred
    - A clear understanding of data warehouse design and architecture
    - Proficiency in AWS DMS and AWS S3
    - AWS email notification

    €113 Average bid
    7 bids
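For the listing above, the S3-to-Redshift leg is typically a single `COPY` command issued against the cluster. A sketch that builds such a statement — the table name, bucket, and IAM role ARN are hypothetical placeholders, and in practice the statement would be executed through a Redshift connection or the Data API:

```python
# Build a Redshift COPY statement for loading CSV files from S3.
# Table name, bucket, and IAM role below are hypothetical placeholders.
def build_copy_statement(table, s3_prefix, iam_role):
    return (
        f"COPY {table} "
        f"FROM '{s3_prefix}' "
        f"IAM_ROLE '{iam_role}' "
        "FORMAT AS CSV IGNOREHEADER 1;"
    )

stmt = build_copy_statement(
    "sales_staging",
    "s3://my-dms-landing/sales/",
    "arn:aws:iam::123456789012:role/RedshiftCopyRole",
)
print(stmt)
```

`IGNOREHEADER 1` skips the CSV header row; the IAM role must grant the cluster read access to the bucket.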

    ...specifically Cassandra, BigQuery, Snowflake, and Redshift.
    Key Responsibilities:
    - Research, understand, and articulate the distinct approaches of the specified databases
    - Translate complex concepts into clear, concise, and reader-friendly articles
    Ideal Candidate Should Have:
    - Very deep expertise in databases and distributed systems.
    - Ideally, a Ph.D. or deep research-writing experience; publications in top conferences are a plus.
    - An understanding of database architectures
    - Prior experience writing technical articles for a technical audience
    - The ability to explain complex topics in an easy-to-understand manner
    - Knowledge of Cassandra, BigQuery, Snowflake, and Redshift is a big plus.
    In the scope of this ...

    €18 / hr Average bid
    21 bids

    ...commonly used packages, especially with GCP. Hands-on experience with data migration and data processing on the Google Cloud stack, specifically:
    - BigQuery
    - Cloud Dataflow
    - Cloud DataProc
    - Cloud Storage
    - Cloud DataPrep
    - Cloud PubSub
    - Cloud Composer & Airflow
    Experience designing and deploying large-scale distributed data processing systems with a few technologies such as PostgreSQL or equivalent databases, SQL, Hadoop, Spark, Tableau. Hands-on experience with Python JSON nested data operations. Exposure to or knowledge of API design and REST, including versioning, isolation, and micro-services. Proven ability to define and build architecturally sound solution designs. Demonstrated ability to rapidly build relationships with key stakeholders. Experience of automated unit testing, automated integra...

    €13 / hr Average bid
    6 bids

    As part of a critical project, I'm looking for an AWS data engineer with substantial experience in the field, spanning 8-10 years.
    Key Responsibilities:
    - AWS data engineering
    - Handling data migration tasks
    - Working exclusively with structured data
    Required Skills and Experience:
    - Deep knowledge of AWS services such as AWS Glue, Amazon Redshift, and Amazon Athena is mandatory; prior experience working with these services is a prerequisite
    - Extensive experience in managing structured data
    - A minimum of 8-10 years of experience in data engineering, specifically within the AWS environment
    - A strong background in data migration tasks is paramount for the effective execution of duties
    Given the nature of the data involved in this pr...

    €12 / hr Average bid
    11 bids

    I'm seeking a talented 3D artist to create a high-quality, realistic model of a bottle. My goal is to use this model for static product visualizations.
    Key Project Elements:
    - Craft a detailed 3D bottle model
    - Ensure realistic textures and materials
    - Optimize for static product visualization renders
    - Tailor specifically for rendering with Redshift in Cinema 4D
    Ideal Skills:
    - Proficiency in Cinema 4D and Redshift
    - Experience with product visualization
    - Strong portfolio of realistic 3D models
    - Ability to deliver detailed and accurate work
    The final output should be a meticulously detailed, photo-realistic 3D model that can be rendered in high-quality static images. I need an artist who can not only capture the intricate details of the bottle but also present it...

    €353 Average bid
    52 bids

    As an ecommerce platform looking to optimize our data management, I require assistance with several key aspects of my AWS big data project, including:
    - Data lake setup and configuration
    - Development of AWS Glue jobs
    - Deployment of Hadoop and Spark clusters
    - Kafka data streaming
    The freelancer hired for this project must possess expertise in AWS, Kafka, and Hadoop. Strong experience with AWS Glue is essential, given the heavy utilization planned for the tool throughout the project. Your suggestions and recommendations regarding these tools and technologies are welcome, but keep in mind that specific tools are needed to successfully complete this project.

    €783 Average bid
    20 bids

    ...spearhead the operation. I plan to utilise Amazon S3, Amazon Redshift, and Amazon EMR for this project. I need your insight and hands-on experience with these AWS services to effectively manage and shape the course of this project. The project will handle medium-sized data, approximately between 1TB and 10TB; it is a large-scale job that demands acute attention and expertise. In regards to data security and privacy, additional encryption measures will be required on top of basic security measures. If you are well-versed in data encryption methods and can ensure the reliability and security of sensitive information, then you are who I am looking for.
    Skills and Experience:
    - Comprehensive knowledge of Amazon S3, Amazon Redshift, Amazon EMR
    - Experience with medium t...

    €829 Average bid
    26 bids

    I am seeking an experienced AWS data architect capable of designing an optimized data architecture for my project. This involves the integral use of multiple AWS services, including Amazon S3, Amazon Redshift, Amazon Athena, AWS Glue, AWS Lambda, and AWS DynamoDB. It is crucial for the project that the freelancer has expertise in integrating AWS Aurora into this data structure.
    Key responsibilities:
    - Comprehensive understanding of Amazon S3, Redshift, Athena, Glue, Lambda, and DynamoDB
    - Experience integrating AWS Aurora into data structures
    - Design of highly optimized and scalable data architectures
    - Familiarity with data migration and management strategies
    - Robust problem-solving abilities with keen attention to detail
    Our ideal candidate is analytical...

    €21 / hr Average bid
    41 bids

    ...Queries:
    - Write a SQL query to find the second-highest salary.
    - Design a database schema for a given problem statement.
    - Optimize a given SQL query.
    Solution Design:
    - Design a parking lot system using object-oriented principles.
    - Propose a data model for an e-commerce platform.
    - Outline an approach to scale a given algorithm for large datasets.
    Big Data Technologies (if applicable):
    - Basic questions on Hadoop, Spark, or other big data tools.
    - How to handle large datasets efficiently.
    - Writing map-reduce jobs (if relevant to the role).
    Statistical Analysis and Data Processing:
    - Write a program to calculate statistical measures like mean, median, mode.
    - Implement data normalization or standardization techniques.
    - Process and analyze large datasets using Python libraries like Pandas.
    Rememb...

    €7 / hr Average bid
    36 bids
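The mean/median/mode exercise from the interview list above is a one-liner with Python's standard library; a minimal sketch (the sample data is made up):

```python
from statistics import mean, median, mode

# Compute basic statistical measures over a made-up sample.
data = [4, 1, 2, 2, 3, 5, 2]

print(mean(data))    # 19/7, about 2.71
print(median(data))  # middle value of the sorted sample: 2
print(mode(data))    # most frequent value: 2
```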

    ...customer-centric software products
    · Analyze existing software implementations to identify areas of improvement and provide deadline estimates for implementing new features
    · Develop software applications using technologies that include, but are not limited to, core Java (11+), Kafka or a messaging system, web frameworks like Struts/Spring, relational (Oracle) and non-relational databases (SQL, MongoDB, Hadoop, etc.), with a RESTful microservice architecture
    · Implement security and data protection features
    · Update and maintain documentation for team processes, best practices, and software runbooks
    · Collaborate with git in a multi-developer team
    · Appreciation for clean and well-documented code
    · Contribution to database design ...

    €1298 (Avg Bid)
    €1298 Average bid
    50 bids
    Hadoop administrator (Ended)

    Project Title: Advanced Hadoop Administrator Description: - We are seeking an advanced Hadoop administrator for an in-house Hadoop setup project. - The ideal candidate should have extensive experience and expertise in Hadoop administration. - The main tasks will include data processing, data storage, and data analysis. - The project is expected to be completed in less than a month. - The administrator will be responsible for ensuring the smooth functioning of the Hadoop system and optimizing its performance. - The candidate should have a deep understanding of Hadoop architecture, configuration, and troubleshooting. - Experience in managing large-scale data processing and storage environments is requi...

    €288 (Avg Bid)
    €288 Average bid
    3 bids

    ...pipe, Streams, Stored Procedures, Tasks, hashing, Row-Level Security, Time Travel, etc. Proficiency in SQL, data structures, and database design principles. Strong experience with ETL or ELT data pipelines and related concepts in pure SQL, such as SCD dimensions and delta processing. 3+ years of experience working with AWS cloud services: S3, Lambda, Glue, Athena, IAM, CloudWatch, Redshift, etc. 5+ years of proven expertise in creating pipelines for real-time and near-real-time integration, working with different data sources: flat files, XML, JSON, Avro files, and databases. Excellent communication skills, including the ability to explain complex technical concepts clearly. Prior experience in a client-facing or consulting role is advantageous. Ability to manage p...
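    The SCD-dimension processing mentioned above can be sketched in a few lines; this is a minimal illustration of SCD Type 2 logic in plain Python, with an invented `customer_id`/`city` dimension (the posting names no schema, and a real implementation would be pure SQL):

    ```python
    from datetime import date

    # Illustrative dimension table: one current row per customer.
    dim = [
        {"customer_id": 1, "city": "Rome", "valid_from": date(2023, 1, 1),
         "valid_to": None, "is_current": True},
    ]

    def apply_scd2(dim_rows, incoming, load_date):
        """SCD Type 2: expire changed rows, append new versions, keep history."""
        current = {r["customer_id"]: r for r in dim_rows if r["is_current"]}
        for rec in incoming:
            old = current.get(rec["customer_id"])
            if old is None or old["city"] != rec["city"]:
                if old is not None:
                    old["valid_to"] = load_date   # close out the old version
                    old["is_current"] = False
                dim_rows.append({**rec, "valid_from": load_date,
                                 "valid_to": None, "is_current": True})
        return dim_rows

    apply_scd2(dim, [{"customer_id": 1, "city": "Milan"}], date(2024, 6, 1))
    # dim now holds two versions for customer 1; only the Milan row is current.
    ```

    The same close-then-insert pattern maps directly onto a SQL `UPDATE` plus `INSERT` (or a `MERGE`) against the delta feed.
    
    
    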

    €8370 (Avg Bid)
    €8370 Average bid
    2 bids

    I am looking for a freelancer to help me with a Proof of Concept (POC) project focusing on Hadoop. Requirement: We drop a file in HDFS, which is then consumed by Spark or Kafka, and the final output/results are pushed into a database. The objective is to show we can handle millions of records as input and load them into the destination. The POC should be completed within 3-4 days and should have a simple level of complexity. Skills and experience required: - Strong knowledge and experience with Hadoop - Familiarity with HDFS and Kafka/Spark - Ability to quickly understand and implement a simple POC project - Good problem-solving skills and attention to detail
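    The pipeline shape described (file landed → processing stage → database sink) can be rehearsed locally before touching a cluster. A minimal stand-in sketch, where an in-memory string plays the HDFS file, a list comprehension plays the Spark transform, and sqlite3 plays the destination database, all assumptions of this illustration:

    ```python
    import csv, io, sqlite3

    # Stand-in for the file dropped into HDFS (header plus three records).
    raw_file = io.StringIO("id,value\n1,10\n2,20\n3,30\n")

    # Stand-in for the destination database.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE results (id INTEGER, value INTEGER)")

    # "Processing stage": parse each record and apply a transform
    # (here, doubling the value) before loading.
    rows = [(int(r["id"]), int(r["value"]) * 2) for r in csv.DictReader(raw_file)]
    conn.executemany("INSERT INTO results VALUES (?, ?)", rows)

    print(conn.execute("SELECT SUM(value) FROM results").fetchone()[0])  # 120
    ```

    Scaling this to millions of records is then a matter of swapping the stand-ins for `spark.read.csv` over HDFS and a JDBC sink, while keeping the same parse/transform/load structure.
    
    
    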

    €157 (Avg Bid)
    €157 Average bid
    9 bids

    ...performance and efficiency. Stack: PySpark, SQL, Python; CDK (TypeScript); AWS Glue, EMR, and Andes. Currently migrating from Teradata to AWS. Responsibilities: - Migrate data from another cloud provider to AWS, ensuring a smooth transition and minimal downtime - Design and develop applications that utilize AWS Glue and Athena for data processing and analysis - Optimize data storage and retrieval using AWS S3 and Redshift, as well as other relevant AWS services - Collaborate with other team members and stakeholders to ensure project success and meet client requirements If you have a strong background in AWS migration, expertise in working with structured data, and proficiency in utilizing AWS Glue and Athena, then this project is perfect for you. Apply now and join our team to hel...

    €8 / hr (Avg Bid)
    €8 / hr Average bid
    14 bids

    I am seeking assistance with a research project focused on data warehouse implementation, specifically in the area of cloud-based data warehouses. Skills and experience required for this project include: - Strong knowledge of data warehousing concepts and principles - Experience with cloud-based data warehousing platforms, such as Amazon Redshift or Google BigQuery - Proficiency in data modeling and designing data warehouse schemas - Understanding of ETL (Extract, Transform, Load) processes and tools - Ability to analyze and integrate data from multiple sources - Familiarity with SQL and other programming languages for data manipulation and analysis The deliverable for this project is a comprehensive report that summarizes the research findings and provides recommendations for i...

    €25 (Avg Bid)
    €25 Average bid
    4 bids
    Hadoop HDFS Setup (Ended)

    ...of DataNode 3: Mike
    Set the last two digits of the IP address of each DataNode:
    IP address of DataNode 1:
    IP address of DataNode 2:
    IP address of DataNode 3:
    Submission Requirements: Submit the following screenshots:
    - Use commands to create three directories on HDFS, named after the first name of each team member.
    - Use commands to upload the Hadoop package to HDFS.
    - Use commands to show the IP addresses of all DataNodes.
    - Provide detailed information (ls -l) of the blocks on each DataNode.
    - Provide detailed information (ls -l) of the fsimage file and edit log file.
    - Include screenshots of the Overview module, Startup Process module, DataNodes module, and Browse Directory module on the Web UI of HDFS.
    MapReduce Temperature Analysis: You are
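    The "MapReduce Temperature Analysis" task is cut off above, but the map/shuffle/reduce structure it implies can be sketched in plain Python; the input format (a year and a temperature reading per line) and the max-per-year aggregation are assumptions, since the actual spec is truncated:

    ```python
    from collections import defaultdict

    # Assumed input: one "<year> <temperature>" record per line.
    records = ["1950 22", "1950 31", "1951 15", "1951 28", "1950 -4"]

    # Map phase: emit (year, temperature) key-value pairs.
    mapped = [(line.split()[0], int(line.split()[1])) for line in records]

    # Shuffle phase: group values by key, as the framework does between phases.
    groups = defaultdict(list)
    for year, temp in mapped:
        groups[year].append(temp)

    # Reduce phase: maximum temperature per year.
    result = {year: max(temps) for year, temps in sorted(groups.items())}
    print(result)  # {'1950': 31, '1951': 28}
    ```

    On a real cluster the same mapper and reducer would run as a Hadoop job (e.g. via Hadoop Streaming), with HDFS supplying the input splits.
    
    
    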

    €14 (Avg Bid)
    €14 Average bid
    2 bids