The RayCore 1000, produced by Siliconarts, is the world's first low-power real-time ray-tracing graphics processor. Siliconarts is now making the design IP (intellectual property) available to chip manufacturers, allowing anyone to integrate the ray-tracing engine into their own graphics processors.
Taking ray tracing out of the production studio and into everyone's hands, Siliconarts is attempting to introduce the next generation of graphics engines today. Producing sharp, natural-looking images with ultra-realistic lighting and shadows is only the beginning.
Siliconarts claims to already have one customer who plans to implement the RayCore in silicon. Here at Highpants we suspect the first phone manufacturer to produce a fully ray-traced interface on a smartphone would have something special on their hands. Not only would it be visually stunning, it would seriously differentiate the phone in a crowded market.
Siliconarts Achievement
Ray tracing is an ancient art form, used extensively by Hollywood to generate its latest animated tales. Rendered on entire server farms, it is hardly ever done in real time. The breakthrough from Siliconarts is reducing the entire process down to a tiny piece of silicon. Solving these two problems, the enormous compute requirement and the lack of real-time performance, could make ray tracing an everyday experience.
Siliconarts has taken a sensible approach with the release of the RayCore 1000, allowing it to be integrated into current graphics processor designs. Instead of trying to introduce an entirely new chipset, the RayCore is simply added to manufacturers' design inventories as a new option.
Also, by targeting this initial chip at the mobile market, a market traditionally thought to be too underpowered to ray trace, Siliconarts will be breaking new ground instead of following past failures.
Software support for the RayCore is still thin on the ground; drivers for open-standard libraries such as OpenRL have to be a priority. The hardware side is further along: RayCore already supports the AMBA and AXI bus interfaces as well as PCI Express, and Android Froyo and ICS are also supported.
Ray Tracing and Rasterization
Ray tracing and rasterization represent two different, still-evolving approaches to producing 3D graphics, and the major difference between them is philosophy. Ray tracing puts quality first, no matter how much CPU time it takes. Rasterization puts the frame rate first, limiting the workload to whatever can be drawn in a fiftieth of a second. Siliconarts is attempting to bridge the two approaches.
The first recursive ray-tracing algorithm was published in 1980 by researcher Turner Whitted. The process involves tracing a ray from the camera through each pixel into the scene, following it as it bounces off the surfaces it strikes, which produces a far more natural result. It is the same process used by 3D animation packages such as 3ds Max and LightWave.
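To make the idea concrete, here is a minimal sketch of Whitted-style ray tracing in Python: primary rays leave the camera through each pixel, are tested against a couple of spheres, and spawn recursive reflection rays. The scene, shading and helper names are invented purely for illustration; this is not Siliconarts' hardware pipeline, just the core algorithm in its simplest form.

```python
# Minimal Whitted-style ray tracer sketch. Illustrative only; the scene,
# shading model and names are made up for this example.
import math

def normalize(v):
    l = math.sqrt(sum(c * c for c in v))
    return tuple(c / l for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def hit_sphere(origin, direction, center, radius):
    """Return the nearest positive distance along the ray, or None."""
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 1e-4 else None

# Scene: (center, radius, grey albedo, reflectivity)
SPHERES = [((0.0, 0.0, -3.0), 1.0, 0.8, 0.5),
           ((0.0, -101.0, -3.0), 100.0, 0.5, 0.0)]
LIGHT_DIR = normalize((1.0, 1.0, 0.5))

def trace(origin, direction, depth=0):
    """Follow one ray; recurse for mirror reflections (the Whitted idea)."""
    if depth > 3:
        return 0.0
    nearest, hit = None, None
    for center, radius, albedo, refl in SPHERES:
        t = hit_sphere(origin, direction, center, radius)
        if t is not None and (nearest is None or t < nearest):
            nearest, hit = t, (center, albedo, refl)
    if hit is None:
        return 0.2  # background brightness
    center, albedo, refl = hit
    point = tuple(o + nearest * d for o, d in zip(origin, direction))
    normal = normalize(tuple(p - c for p, c in zip(point, center)))
    diffuse = albedo * max(0.0, dot(normal, LIGHT_DIR))
    if refl > 0.0:
        d_dot_n = dot(direction, normal)
        reflected = tuple(d - 2.0 * d_dot_n * n for d, n in zip(direction, normal))
        return (1.0 - refl) * diffuse + refl * trace(point, normalize(reflected), depth + 1)
    return diffuse

# Render a tiny ASCII image: one primary ray per character cell.
for y in range(12):
    row = ""
    for x in range(32):
        u = (x / 31.0) * 2.0 - 1.0
        v = 1.0 - (y / 11.0) * 2.0
        shade = trace((0.0, 0.0, 0.0), normalize((u, v * 0.5, -1.0)))
        row += " .:-=+*#%@"[min(9, int(shade * 9))]
    print(row)
```

Even this toy version shows why the technique is expensive: every pixel tests every object, and every bounce multiplies the work, which is exactly what dedicated silicon like the RayCore is meant to absorb.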
The 3D images we see on our PCs and touch-screen devices are generated with a different technique, a process called rasterization. Games and video cards accelerate the rasterization process, which builds the scene and its objects from a wireframe of triangles. That wireframe is then filled in and shaded, pixel by pixel, to produce the final image.
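For comparison, here is an equally minimal rasterization sketch in Python: one triangle is tested pixel by pixel with edge functions, the same kind of coverage test GPUs run in hardware. The triangle, grid size and helper names are again invented for illustration; there is no depth buffering, texturing or shading here, just the fill step.

```python
# Minimal rasterization sketch: fill one 2D triangle using edge functions.
# Illustrative only; coordinates and names are made up for this example.
def edge(ax, ay, bx, by, px, py):
    """Signed area test: positive when point (px, py) is left of edge a->b."""
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

def rasterize_triangle(v0, v1, v2, width=32, height=16):
    """Return an ASCII grid with '#' for pixels covered by the triangle."""
    grid = [[" "] * width for _ in range(height)]
    area = edge(*v0, *v1, *v2)
    if area == 0:
        return grid  # degenerate triangle, nothing to draw
    for y in range(height):
        for x in range(width):
            px, py = x + 0.5, y + 0.5  # sample at the pixel centre
            w0 = edge(*v1, *v2, px, py)
            w1 = edge(*v2, *v0, px, py)
            w2 = edge(*v0, *v1, px, py)
            # Pixel is inside when all three edge tests agree with the winding.
            if (w0 >= 0 and w1 >= 0 and w2 >= 0) or (w0 <= 0 and w1 <= 0 and w2 <= 0):
                grid[y][x] = "#"
    return grid

for row in rasterize_triangle((3, 2), (28, 6), (10, 14)):
    print("".join(row))
```

The work here scales with the number of triangles and covered pixels rather than with light bounces, which is why rasterization has always been the easier path to real-time frame rates.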
History
Many software and hardware attempts have been made to replace rasterization with ray tracing. NVIDIA has been promising that all games will be ray traced within 10 years, a claim it has been repeating for the last 12 years.
2007 saw Intel working on the top-secret Larrabee project, a massively parallel multiprocessor destined to become a graphics processor and real-time ray tracer. With 40 simple CPU cores running in parallel, Larrabee was hoped to be Intel's foot in the graphics-processor door.
Early in the project's life it was touted as being fast enough to allow real-time ray tracing. Intel ported Quake IV at the time, releasing the Quake IV: Ray Traced edition; running on an 8-core Xeon server it managed 12 fps. There was also a buzz when industry pundits learned that Quake IV was running on the Xeon and not on Larrabee.
The Larrabee project was eventually cancelled, and to this day Intel produces anaemic graphics processors while real-time ray tracing remains a pipe dream. An interesting article, 'Real Time Ray-Tracing: The End of Rasterization?', was published during that period.
Graphics chip designer PowerVR also has hardware available that can bring real-time ray tracing to the workstation user interface. Caustic sells a product that uses the PowerVR hardware to display real-time ray-traced graphics in the Maya preview window.
The hardware accelerator uses the open software platform OpenRL to tie it all together. The OpenRL API was established to give hardware and software developers a standard way to work together; it is one of the great hopes for ray-traced graphics on the desktop.
Conclusion
The common argument of ray tracing versus rasterization is actually a moot point, as they are two different approaches used for two very different purposes. Ray tracing will steal ideas from rasterization and vice versa as the two continue to evolve.
Taking ray tracing to places it has never been before, Siliconarts is bringing a new class of graphics to handsets everywhere. Will we ever see a real-time ray-traced interface on our desktops and smartphones? Here at Highpants we think the answer is yes; the real question is when.
Reference: Siliconarts
Reference: Xbitlabs
Reference: Real Time Ray-Tracing: The End of Rasterization?
Reference: Wikipedia, Ray Tracing
Reference: Ray Tracing versus Rasterization
Reference: Imagination Real time ray tracing
Reference: Tom’s Hardware: When will ray tracing replace rasterization?