This morning, I woke up to find the following comment in the MATLAB® Newsgroup:
Over two years ago, MathWorks® started to build a clone of Jacket, which you now know as the GPU computing support in the Parallel Computing Toolbox™. At the time, there were many naysayers suggesting that Jacket would somehow be eclipsed by the clone. Made sense, right?
Wrong! Here we are two years later and the clone is still a poor imitation. There are several technical reasons for this, but if you are serious about getting great performance from your GPU, Jacket is the better option. Look at all the real customers who are getting big benefits. Here are some other recent benchmarks from the Walking Randomly blog showing that Jacket on a laptop is faster than PCT™ on a Tesla:
If it were easy to imitate Jacket, then MathWorks® would have siphoned away all the Jacket users. The truth is that it is not easy to build great GPU software, and the Jacket user base continues to explode. Jacket is not only better than PCT™ but is also getting better at a faster rate. Here’s to another 2, 5, and 10+ years of great speeds for all of you Jacket programmers!
To Juliette and others out there, if you really want PCT™ to get better, you might consider asking MathWorks® to spend less time cloning and more time working with others who are adding value to the MATLAB® ecosystem.