
AI Research Engineer: Tether

Aug 14, 2025   |   Location: Remote   |   Deadline: Not specified

Experience: Mid

Salary: $122k - $244k Estimated

Tether is seeking an AI Research Engineer to join its AI model team. This is a hands-on, research-driven role focused on advancing AI through innovation in model architecture and pre-training optimization, spanning small, large, and multi-modal models. The ideal candidate will have deep expertise in LLM architectures and a strong grasp of pre-training optimization. The position offers the opportunity to work at the cutting edge of AI and fintech at a company that describes itself as a global talent powerhouse.

Key Responsibilities
Pre-training: Conduct pre-training of AI models on large, distributed servers equipped with thousands of NVIDIA GPUs.

Architecture: Design, prototype, and scale innovative model architectures to enhance intelligence and efficiency.

Experimentation: Independently and collaboratively execute experiments, analyze results, and refine methodologies for optimal performance.

Optimization: Investigate, debug, and improve both model efficiency and computational performance.

Systems: Contribute to the advancement of training systems to ensure seamless scalability and efficiency.

Research: Explore and implement novel techniques and algorithms, focusing on data curation, strengthening baselines, and resolving pre-training bottlenecks.

Requirements
Education: A degree in Computer Science or a related field is required. A Ph.D. in NLP, Machine Learning, or a related field with a strong track record in AI R&D (including publications in A* conferences) is ideal.

Experience:

Hands-on experience contributing to large-scale LLM training runs on large, distributed servers with thousands of NVIDIA GPUs.

Practical experience with large-scale, distributed training frameworks, libraries, and tools.

Technical Skills:

Deep knowledge of state-of-the-art transformer and non-transformer modifications for enhancing intelligence, efficiency, and scalability.

Strong expertise in PyTorch and Hugging Face libraries with practical experience in model development, continual pre-training, and deployment.

Soft Skills: Excellent English communication skills.

