Position Overview:
An AWS Architect at Affinity Plus builds secure, resilient, and highly scalable solutions while keeping simplicity and operational effectiveness in mind. This role collaborates with teammates, product teams, and cross-functional project teams to lead the adoption and integration of AWS architecture services into the enterprise. You will be responsible for implementing securely architected solutions that are operationally reliable, performant, and deliver on strategic initiatives with tangible, data-driven outcomes.
Duties and Responsibilities:
- Work closely with team members to lead, design, and drive enterprise solutions, advising on key decision points, trade-offs, best practices, and risk mitigation
- Use a defense-in-depth approach when designing and deploying performant systems that auto-scale appropriately and are highly available, fault-tolerant, self-monitoring, and serviceable
- Practice the six pillars of the AWS Well-Architected Framework: operational excellence, security, reliability, performance efficiency, cost optimization, and sustainability
- Create and maintain secure, performant network designs, assisting with troubleshooting as needed
- Build out new API integrations
- Assemble large, complex data sets into workstreams that meet functional and non-functional business requirements
- Assist and advise data engineers in preparing and delivering raw data into the data lake for prescriptive and predictive modeling
- Maintain impeccable version control practices when committing and branching, and follow security best practices including data encryption, user authentication and authorization, access controls, audit trails, and threat detection
- Automate infrastructure provisioning where it makes sense, and strive to ensure migrated workloads are cloud-native
- Partner with developers to continuously improve their ability to develop and deploy applications
- Build infrastructure for optimal extraction, loading, and transformation of data from a wide variety of data sources
- Work with developers to troubleshoot, maintain, and monitor scalable data pipelines
- Perform root cause analysis to answer specific business questions and identify opportunities for process improvement
- Collaborate with the Enterprise Digital Intelligence (EDI) team to improve the data workflows that feed business intelligence tools, increasing data accessibility for staff and fostering data-driven decision-making across the organization
- Use observability and SIEM tools to monitor data and services, ensuring production data is secure, has integrity, and is available to key stakeholders and the business processes that depend on it
- Work in a hybrid workflow environment using both agile methodologies and waterfall project/product management
- Employ change management best practices to ensure that services remain readily accessible to the business
- Maintain the tools, processes, and associated documentation used to manage the compute environment
- Implement reusable design templates and solutions to integrate, automate, and orchestrate cloud operational needs
- Be a good steward and practice effective cloud governance controls in cloud operations
- Manage and monitor Windows, Red Hat, and CentOS Linux operating systems using tools such as AWS Systems Manager and Red Hat Satellite
- Communicate readily with leadership on topics including outages, updates on key infrastructure items, audit mitigation progress, and security vulnerabilities
- Other duties as assigned

Qualifications and Skills:
Required Qualifications and Skills:
- 2+ years' experience with data lakes (e.g., Databricks, Snowflake, Amazon S3, and/or AWS Lake Formation)
- 3+ years' related experience designing secure, scalable, and cost-effective big data architecture
- 5+ years' experience in software development, data engineering, or data analytics using Python, Scala, Spark, Java, or equivalent technologies
- Bachelor's or Master's degree in Big Data, Computer Science, Engineering, Mathematics, or a similar area of study, or equivalent work experience
- Mid-level knowledge of code versioning tools such as Git, Mercurial, or SVN
- Expert proficiency in Python, C++, Java, R, and SQL, as well as AWS Lambda
- Expert proficiency in IaC tools (e.g., Terraform, Ansible, CloudFormation)
- Proficiency in software engineering best practices across the software development lifecycle, including coding standards, code reviews, source control management, build processes, testing, and operations
- Advanced working SQL knowledge and experience with relational databases and query authoring, as well as working familiarity with a variety of unstructured data
- Strong understanding of financial industry technology standards and compliance requirements, and experience working with audit and regulatory bodies
- Mid-level experience with core foundational AWS services, and expertise in first-party Big Data & AI services such as AWS Glue, Glue Crawlers, Amazon Athena, Amazon Kinesis, and Amazon QuickSight
- Proficiency with all types of operating systems, especially Linux and Unix
- Proficient-level experience with architecture design, build, and optimization of big data collection, ingestion, storage, processing, and visualization
- Proficiency in building, automating, and deploying data pipelines and workflows into end-user-facing applications
- Ability to stay current with industry standards and technological advancements that enhance data quality and reliability in support of strategic initiatives
- Technical expertise with data models, data mining, and segmentation techniques
- Working knowledge of RESTful APIs, the OAuth2 authorization framework, and security best practices for API gateways
- Expertise in diagnostics and problem resolution, providing third-level support
- Familiarity with unstructured data sets (e.g., voice, image, log files, social media posts, email)
- An organized, methodical approach and a continuous-improvement mindset
- A demonstrated predisposition for action, a willingness to partner and mentor, and an innate drive to provide an exceptional member and employee experience
- A highly creative and innovative technologist who thrives independently and collaborates well in a team environment
- Strong analytical and decision-making skills with a high degree of accuracy
- Strong verbal, written, and interpersonal communication skills
- Time management skills and the ability to prioritize workloads

Preferred Qualifications:
- Experience in a financial institution
- Expert-level knowledge of AWS infrastructure configurations and service offerings
- Expert-level knowledge of data frameworks, data lakes, and open-source projects such as Apache Spark, MLflow, and Delta Lake
- Experience with MDM using data governance solutions
- Advanced technical certifications preferred: AWS Certified Cloud Practitioner, Solutions Architect, Certified Developer, or SysOps Administrator; RHEL RHCSA/RHCE; AWS Certified Data Analytics; DASCA Big Data Engineering and Analytics

Workplace Environment:
- Working in a stationary position for 80% of the workday
- Using the telephone and video conferencing 10-20% of the day
- Moving, lifting, and/or carrying 30 pounds, with or without accommodations
- Bending, twisting, kneeling, stooping, or crouching on occasion, when appropriate
- Repetitive movements, including but not limited to typing, mousing, and phone use
- May require travel for an onsite presence for employee meetings and events for collaboration, connection, project work, All-Employee Day, etc.

Required Work Schedule:
Standard Monday through Friday business hours, with participation in a 24/7 on-call rotation and a willingness to work after hours as needed for upgrades, feature rollouts, etc. Consistent and reliable attendance is a required essential function of this role to meet the needs of the department/team and organization.
This position can be based virtually but does require travel to St. Paul, MN for training, company events, project initiatives, and team and department meetings.
Equal Opportunity Employer/Protected Veterans/Individuals with Disabilities
The contractor will not discharge or in any other manner discriminate against employees or applicants because they have inquired about, discussed, or disclosed their own pay or the pay of another employee or applicant. However, employees who have access to the compensation information of other employees or applicants as a part of their essential job functions cannot disclose the pay of other employees or applicants to individuals who do not otherwise have access to compensation information, unless the disclosure is (a) in response to a formal complaint or charge, (b) in furtherance of an investigation, proceeding, hearing, or action, including an investigation conducted by the employer, or (c) consistent with the contractor's legal duty to furnish information. 41 CFR 60-1.35(c)