For more than two decades, NVIDIA has pioneered visual computing, the art and science of computer graphics.

With a singular focus on this field, we offer specialized platforms for the gaming, professional visualization, data center and automotive markets.

Our work is at the center of the most consequential mega-trends in technology — virtual reality, artificial intelligence and self-driving cars.

NVIDIA Zürich, ZH, Switzerland
Full time
Today, we stand at the beginning of the AI computing era, ignited by a new computing model: GPU deep learning. This new model, in which deep neural networks are trained to recognize patterns in massive amounts of data, has proven remarkably effective at solving some of the most complex problems in everyday life. In this context, we are looking for a Rendering & Machine Learning Research Intern. You will work with us on investigating the application of deep learning and other machine-learning techniques to physically based rendering: the synthesis of realistic images by simulating the propagation of light through a virtual world. We leverage machine learning to increase the efficiency and quality of rendering algorithms, with the ambitious eventual goal of a photorealistic real-time rendering engine.

What you'll be doing
- Inventing, together with other researchers, novel algorithms at the intersection of light-transport simulation and machine learning.
- Prototyping and evaluating these algorithms.
- Documenting your progress, with the possibility of eventually publishing at an international conference.

What we need to see
- You are pursuing a PhD in Computer Science or a related field.
- You have a strong background in computer graphics.
- You are familiar with Monte Carlo integration methods.
- You are familiar with general concepts of machine learning as well as neural networks.
- You have experience programming in C++ (ideally also in CUDA) as well as in Python.
- You are dedicated to mathematical correctness and high-quality software.

Ways to stand out from the crowd
- A real passion for rendering and/or machine learning.
- Familiarity with a popular research renderer (e.g. Mitsuba, PBRT, or Tungsten) and popular machine-learning frameworks (e.g. TensorFlow, PyTorch, scikit-learn).
- A track record of open-source projects, e.g. as part of university courses or as a hobby.
- Existing publications at international conferences in either computer graphics or machine learning.
- Strong expertise in machine learning.
- Strong written, verbal, and mathematical skills.

At NVIDIA, our employees are passionate about parallel and visual computing. We're united in our quest to transform the way graphics are used for work and play. Our technology impacts the visual experience in video game development, film production, space exploration, medicine, computational finance and automotive design. And we've only scratched the surface of what our technology can accomplish. We need passionate, hard-working and creative people to help us pursue some of these unrivaled opportunities.

With highly competitive salaries and a comprehensive benefits package, NVIDIA is widely considered one of the technology world's most desirable employers. We have some of the most brilliant and talented people in the world working for us and, due to extraordinary growth, our elite engineering teams are growing fast.

NVIDIA is committed to fostering a diverse work environment and is proud to be an equal opportunity employer. As we highly value diversity in our current and future employees, we do not discriminate (including in our hiring and promotion practices) on the basis of race, religion, color, national origin, gender, gender expression, sexual orientation, age, marital status, veteran status, disability status or any other characteristic protected by law.
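The requirements above mention Monte Carlo integration, the workhorse of physically based rendering (pixel values are estimated by averaging randomly sampled light paths). As a flavor of the prerequisite, here is a minimal, illustrative sketch of a Monte Carlo estimator in Python; the helper name `mc_integrate` is hypothetical and not part of the posting:

```python
import random

def mc_integrate(f, a, b, n=100_000, seed=0):
    """Estimate the integral of f over [a, b] with n uniform random samples.

    The estimator is (b - a) * mean(f(x_i)), which converges to the true
    integral at a rate of O(1/sqrt(n)) -- the same principle underlying
    path tracing, just in one dimension instead of over light paths.
    """
    rng = random.Random(seed)  # seeded for reproducibility
    total = 0.0
    for _ in range(n):
        x = rng.uniform(a, b)  # sample uniformly in [a, b]
        total += f(x)
    return (b - a) * total / n

# Example: the integral of x^2 over [0, 1] is exactly 1/3;
# the estimate approaches that value as n grows.
estimate = mc_integrate(lambda x: x * x, 0.0, 1.0)
```

In a renderer the same idea applies with importance sampling (drawing samples from a distribution proportional to the integrand) to reduce variance, which is where the machine-learning angle of this internship comes in.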