It's certainly possible to develop on a Windows machine; in fact, my first application was developed exclusively on the old Dell Precision I had at the time :)
There are three routes:
- Install OSx86 (aka iATKOS / Kalyway) on a second partition/disk and dual boot.
- Run Mac OS X Server under VMware (Mac OS X 10.7 (Lion) onwards; read the update below).
- Use Delphi XE4 and the macincloud service. This is a commercial toolset, but component and library support is growing.
The first route requires modifying (or using a pre-modified) image of Leopard that can be installed on a regular PC. This is not as hard as you might think, although your success/effort ratio will depend on how closely the hardware in your PC matches that in real Mac hardware - e.g. if you're running a Core 2 Duo on an Intel motherboard with an NVidia graphics card, you are laughing. If you're running an AMD machine or something without SSE3, it gets a little more involved.
If you purchase (or already own) a copy of Leopard, then this is a gray area, since the Leopard EULA states you may only run it on an "Apple Labeled" machine. As many point out, if you stick an Apple sticker on your PC, you're probably covered.
The second option is more costly. The EULA for the workstation version of Leopard prevents it from being run under emulation, and as a result there's no support for it in VMware. Leopard Server, however, CAN be run under emulation and can be used for desktop purposes. Both Leopard Server and VMware are expensive, though.
If you're interested in option 1, I would suggest starting at Insanelymac and reading the OSx86 sections.
I do think you should consider whether the time you will invest is going to be worth the money you save, though. It was for me, because I enjoy tinkering with this type of stuff, and I started during the early iPhone betas, months before the App Store became available.
Alternatively, you could pick up a low-spec Mac Mini from eBay. You don't need much horsepower to run the SDK and you can always sell it on later if you decide to stop development or buy a better Mac.
Update: You cannot create a Mac OS X Client virtual machine for OS X 10.6 and earlier; Apple does not allow these client OSes to be virtualized. From Mac OS X 10.7 (Lion) onwards, Apple has changed its licensing agreement with regard to virtualization. Source: VMware Knowledge Base
The effect you’re looking for can be achieved with texture combiners in OpenGL ES 1.1. By default, each texture unit that you enable is set up to multiply the output of the previous stage by the color of the current texture. In the case of the first texture unit, the previous stage is simply the vertex color. By changing the texture combiner state, you can add, subtract, interpolate, or take dot products of your texture samples instead.
The second and third examples on the linked page, which interpolate between two textures, should be fairly similar to what you're trying to do. If you compare the source code for the two examples, you should see that they're nearly identical, except for the configuration of GL_SRC2_RGB/GL_SRC2_ALPHA and GL_OPERAND2_RGB/GL_OPERAND2_ALPHA. What you'll need to specify here depends on where/how you're generating the blend factor for the two textures. You can source it from the vertex color by specifying GL_PRIMARY_COLOR for GL_SRC2_*, which isn't shown in the examples.
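To make that concrete, here's a minimal sketch of the interpolate setup, assuming two texture units and two already-loaded texture names (tex0 and tex1 are placeholder names). It uses the vertex color's alpha as the blend factor, which is one common choice:

```c
#include <GLES/gl.h>

/* Unit 0: just output the first texture's color unchanged. */
glActiveTexture(GL_TEXTURE0);
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, tex0);
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);

/* Unit 1: result = previous * a + texture * (1 - a), where a is
 * the primary (vertex) color's alpha. */
glActiveTexture(GL_TEXTURE1);
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, tex1);
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE);
glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_RGB,  GL_INTERPOLATE);
glTexEnvi(GL_TEXTURE_ENV, GL_SRC0_RGB,     GL_PREVIOUS);
glTexEnvi(GL_TEXTURE_ENV, GL_OPERAND0_RGB, GL_SRC_COLOR);
glTexEnvi(GL_TEXTURE_ENV, GL_SRC1_RGB,     GL_TEXTURE);
glTexEnvi(GL_TEXTURE_ENV, GL_OPERAND1_RGB, GL_SRC_COLOR);
glTexEnvi(GL_TEXTURE_ENV, GL_SRC2_RGB,     GL_PRIMARY_COLOR);
glTexEnvi(GL_TEXTURE_ENV, GL_OPERAND2_RGB, GL_SRC_ALPHA);
```

With this state, a vertex alpha of 1.0 selects unit 0's texture and 0.0 selects unit 1's, so per-vertex colors give you a per-vertex crossfade. The alpha combiner (GL_COMBINE_ALPHA, GL_SRC*_ALPHA, GL_OPERAND*_ALPHA) can be configured the same way if you also need the interpolated alpha.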
(Note: the page I linked to recommends using GLSL instead of texture combiners. This is unfortunately not an option if your software needs to run on older hardware that doesn’t support OpenGL ES 2.0.)
Maybe I'm not getting you right, but to me it seems trivial, and I've been doing it in my apps successfully. The way to go is:
```c
glEnable(GL_BLEND);
glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
glColor4f(r * a, g * a, b * a, a);  /* premultiplied tint color */
```
The blend function gives you Porter-Duff "over" with premultiplied colors/textures. GL_TEXTURE_ENV_MODE must be set to GL_MODULATE, but that's the default.
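This only works if your texels really are premultiplied. If your source image has straight (non-premultiplied) alpha, a sketch of the one-time conversion you'd run before uploading with glTexImage2D could look like this (pixels and count are placeholder names for your tightly packed RGBA8888 data):

```c
#include <stddef.h>

/* Convert straight-alpha RGBA8888 texels to premultiplied alpha,
 * so they match the (GL_ONE, GL_ONE_MINUS_SRC_ALPHA) blend above.
 * count is the number of pixels, not bytes. */
static void premultiply_rgba8888(unsigned char *pixels, size_t count)
{
    for (size_t i = 0; i < count; ++i) {
        unsigned char *p = &pixels[i * 4];
        unsigned a = p[3];
        p[0] = (unsigned char)((p[0] * a + 127) / 255);  /* R */
        p[1] = (unsigned char)((p[1] * a + 127) / 255);  /* G */
        p[2] = (unsigned char)((p[2] * a + 127) / 255);  /* B */
    }
}
```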