Salesforce Developer • Remote, Work from Home, United States of America

Job Responsibilities:

Salesforce Development:
- Develop and maintain custom Salesforce applications using Apex, Visualforce, Lightning Components, and Salesforce APIs.
- Customize Salesforce platform features, workflows, and processes.
- Collaborate with business users to translate requirements into technical solutions within the Salesforce ecosystem.
- Integrate Salesforce with external systems, ensuring seamless data flow.

Hadoop Development:
- Develop and manage big data solutions using Hadoop and related technologies (MapReduce, Hive, Pig, etc.).
- Design and implement data pipelines for efficient extraction, transformation, and loading (ETL) of data from various sources into Hadoop.
- Use Hadoop ecosystem components such as HDFS, YARN, and Spark for large-scale data processing.
- Work with data architects to ensure optimal storage and retrieval from Hadoop clusters.

System Integration:
- Integrate Hadoop data solutions with Salesforce for analytics, reporting, and data processing.
- Collaborate with cross-functional teams (data engineers, data scientists, and Salesforce admins) to ensure data consistency and performance.
- Troubleshoot integration issues between Salesforce and Hadoop-based systems.

Performance Tuning and Optimization:
- Optimize Salesforce code and data structures for better performance.
- Monitor and tune Hadoop jobs and clusters for performance and resource utilization.

Documentation & Reporting:
- Maintain clear documentation for Salesforce development, customizations, and integrations.
- Prepare and present reports on system performance, data analysis, and integration results.

Skills and Qualifications:

Salesforce:
- Proficiency in Apex, Visualforce, Lightning Components, and Salesforce APIs.
- Strong knowledge of Salesforce administration, customization, and best practices.
- Experience with Salesforce data models, security, and deployment tools (e.g., Change Sets, Salesforce DX, Ant Migration Tool).
- Certifications such as Salesforce Platform Developer I/II or Salesforce Administrator are a plus.

Hadoop:
- Strong experience with Hadoop ecosystem tools, including HDFS, YARN, Hive, Pig, and Spark.
- Experience managing large datasets in a distributed computing environment.
- Proficiency in writing and optimizing MapReduce jobs.
- Experience with ETL processes and tools (Apache NiFi, Sqoop, etc.) in Hadoop.

Other Skills:
- Strong problem-solving skills and the ability to troubleshoot complex technical issues.
- Excellent communication and collaboration skills, with the ability to work across teams.
- Experience with integration tools such as MuleSoft, Apache Kafka, or Talend is a plus.
- Familiarity with Agile development methodologies and DevOps tools (Git, Jenkins, etc.).

Equal Opportunity Employer

We are an equal opportunity employer. All aspects of employment, including the decision to hire, promote, discipline, or discharge, will be based on merit, competence, performance, and business needs. We do not discriminate on the basis of race, color, religion, marital status, age, national origin, ancestry, physical or mental disability, medical condition, pregnancy, genetic information, gender, sexual orientation, gender identity or expression, citizenship or immigration status, veteran status, or any other status protected under federal, state, or local law.