It's certainly possible to develop on a Windows machine; in fact, my first application was developed exclusively on the old Dell Precision I had at the time :)
There are three routes:
- Install OSx86 (aka iATKOS / Kalyway) on a second partition/disk and dual boot.
- Run Mac OS X Server under VMWare (Mac OS X 10.7 (Lion) onwards, read the update below).
- Use Delphi XE4 and the macincloud service. This is a commercial toolset, but the component and lib support is growing.
The first route requires modifying (or using a pre-modified) image of Leopard that can be installed on a regular PC. This is not as hard as you would think, although your success/effort ratio will depend upon how closely the hardware in your PC matches that in Mac hardware - e.g. if you're running a Core 2 Duo on an Intel Motherboard, with an NVidia graphics card you are laughing. If you're running an AMD machine or something without SSE3 it gets a little more involved.
If you purchase (or already own) a version of Leopard then this is a gray area since the Leopard EULA states you may only run it on an "Apple Labeled" machine. As many point out if you stick an Apple sticker on your PC you're probably covered.
The second option is more costly. The EULA for the workstation version of Leopard prevents it from being run under emulation and as a result, there's no support in VMWare for this. Leopard server, however, CAN be run under emulation and can be used for desktop purposes. Leopard server and VMWare are expensive, however.
If you're interested in option 1) I would suggest starting at Insanelymac and reading the OSx86 sections.
I do think you should consider whether the time you will invest is going to be worth the money you will save though. It was for me because I enjoy tinkering with this type of stuff and I started during the early iPhone betas, months before their App Store became available.
Alternatively, you could pick up a low-spec Mac Mini from eBay. You don't need much horsepower to run the SDK and you can always sell it on later if you decide to stop development or buy a better Mac.
Update: You cannot create a Mac OS X Client virtual machine for OS X 10.6 and earlier. Apple does not allow these Client OSes to be virtualized. With Mac OS X 10.7 (Lion) onwards, Apple has changed its licensing agreement with regard to virtualization. Source: VMWare KnowledgeBase
On Windows 7/8/10, you can install Chocolatey, which has a script for this built-in. After installing Chocolatey, just type `refreshenv`.
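A quick sketch of how that is typically used (the scenario in the comments is an assumption for illustration; note that in PowerShell, `refreshenv` is available once Chocolatey's profile module, `chocolateyProfile.psm1`, is loaded):

```powershell
# Sketch: an installer (or another window) has just modified PATH or other environment variables.
# Instead of closing and reopening the shell, reload the machine/user environment into the CURRENT session:
refreshenv
```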
Best Answer
Note: This answer shows how to switch the character encoding in the Windows console to UTF-8 (code page `65001`), so that shells such as `cmd.exe` and PowerShell properly encode and decode characters (text) when communicating with external (console) programs with full Unicode support, and in `cmd.exe` also for file I/O.[1]

If, by contrast, your concern is about the separate aspect of the limitations of Unicode character rendering in console windows, see the middle and bottom sections of this answer, where alternative console (terminal) applications are discussed too.
As of (at least) Windows 10, version 1903, you have the option to set the system locale (language for non-Unicode programs) to UTF-8, but the feature is still in beta as of this writing.
To activate it: run `intl.cpl` (which opens the regional settings in Control Panel), switch to the Administrative tab, click Change system locale..., tick the Beta: Use Unicode UTF-8 for worldwide language support checkbox, and reboot.

This sets both the system's active OEM and ANSI code pages to `65001`, the UTF-8 code page, which therefore (a) makes all future console windows, which use the OEM code page, default to UTF-8 (as if `chcp 65001` had been executed in a `cmd.exe` window) and (b) also makes legacy, non-Unicode GUI-subsystem applications, which (among others) use the ANSI code page, use UTF-8.
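As a quick sanity check (a sketch, not part of the original instructions), you can verify in a new console window after the reboot that the switch took effect:

```powershell
# Sketch: run in a NEW console window after rebooting
chcp                        # external program; should report: Active code page: 65001
[console]::OutputEncoding   # should show a UTF-8 encoding (code page 65001)
[console]::InputEncoding    # likewise
```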
Caveats:

- If you're using Windows PowerShell, this will also make `Get-Content` and `Set-Content` and other contexts where Windows PowerShell defaults to the system's active ANSI code page, notably reading source code from BOM-less files, default to UTF-8 (which PowerShell Core (v6+) always does). This means that, in the absence of an `-Encoding` argument, BOM-less files that are ANSI-encoded (which is historically common) will then be misread, and files created with `Set-Content` will be UTF-8 rather than ANSI-encoded.

- [Fixed in PowerShell 7.1] Up to at least PowerShell 7.0, a bug in the underlying .NET version (.NET Core 3.1) causes follow-on bugs in PowerShell: a UTF-8 BOM is unexpectedly prepended to data sent to external processes via stdin (irrespective of what you set `$OutputEncoding` to), which notably breaks `Start-Job` - see this GitHub issue.

- Not all fonts speak Unicode, so pick a TT (TrueType) font, but even they usually support only a subset of all characters, so you may have to experiment with specific fonts to see if all characters you care about are represented - see this answer for details, which also discusses alternative console (terminal) applications that have better Unicode rendering support.

- As eryksun points out, legacy console applications that do not "speak" UTF-8 will be limited to ASCII-only input and will produce incorrect output when trying to output characters outside the (7-bit) ASCII range. (In the obsolescent Windows 7 and below, programs may even crash.) If running legacy console applications is important to you, see eryksun's recommendations in the comments.
- However, for Windows PowerShell, that is not enough: you must additionally set the `$OutputEncoding` preference variable to UTF-8 as well: `$OutputEncoding = [System.Text.UTF8Encoding]::new()`[2]; it's simplest to add that command to your `$PROFILE` (current user only) or `$PROFILE.AllUsersCurrentHost` (all users) file.
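As an illustration of the first caveat above (a sketch only; the file name and the Windows-1252 code page are assumptions, not from the original answer), a legacy BOM-less ANSI file then needs its encoding spelled out explicitly:

```powershell
# Sketch: after the system-wide switch to UTF-8, read a file that is really ANSI (Windows-1252).
# The .NET call behaves the same in Windows PowerShell and PowerShell (Core).
$text = [System.IO.File]::ReadAllText("$PWD\legacy.txt", [System.Text.Encoding]::GetEncoding(1252))
```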
If setting the system locale to UTF-8 is not an option in your environment, use startup commands instead:

Note: The caveat re legacy console applications mentioned above equally applies here. If running legacy console applications is important to you, see eryksun's recommendations in the comments.
For PowerShell (both editions), add the following line to your `$PROFILE` (current user only) or `$PROFILE.AllUsersCurrentHost` (all users) file, which is the equivalent of `chcp 65001`, supplemented with setting preference variable `$OutputEncoding` to instruct PowerShell to send data to external programs via the pipeline in UTF-8; a sketch of such a line is shown after the following note.

Note: Running `chcp 65001` from inside a PowerShell session is not effective, because .NET caches the console's output encoding on startup and is unaware of later changes made with `chcp`; additionally, as stated, Windows PowerShell requires `$OutputEncoding` to be set - see this answer for details. You can also create or update your `$PROFILE` programmatically; a sketch of that follows as well.
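A minimal sketch of such a profile line (the combined assignment shown here is one common formulation, not necessarily the exact line intended above):

```powershell
# Sketch: switch the console's input/output encodings AND PowerShell's $OutputEncoding to UTF-8.
# [console]::InputEncoding / [console]::OutputEncoding control how the console decodes/encodes bytes;
# $OutputEncoding controls what PowerShell sends to external programs via the pipeline.
$OutputEncoding = [console]::InputEncoding = [console]::OutputEncoding = New-Object System.Text.UTF8Encoding
```

And a sketch for creating/updating `$PROFILE` programmatically (the `Add-Content` approach is an assumption; editing the file by hand works just as well):

```powershell
# Sketch: create the profile file if it doesn't exist yet, then append the encoding line to it.
if (-not (Test-Path $PROFILE)) { New-Item -ItemType File -Path $PROFILE -Force | Out-Null }
Add-Content -Path $PROFILE -Value '$OutputEncoding = [console]::InputEncoding = [console]::OutputEncoding = New-Object System.Text.UTF8Encoding'
```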
For `cmd.exe`, define an auto-run command via the registry, in value `AutoRun` of key `HKEY_CURRENT_USER\Software\Microsoft\Command Processor` (current user only) or `HKEY_LOCAL_MACHINE\Software\Microsoft\Command Processor` (all users):
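A sketch of how the current-user value could be set from PowerShell (the `>NUL` suppresses chcp's message on every `cmd.exe` startup; setting the value via regedit works just as well):

```powershell
# Sketch: make every new cmd.exe session run 'chcp 65001' automatically (current user only)
$key = 'HKCU:\Software\Microsoft\Command Processor'
if (-not (Test-Path $key)) { New-Item -Path $key -Force | Out-Null }   # the key normally exists already
Set-ItemProperty -Path $key -Name AutoRun -Value 'chcp 65001 >NUL'
```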
Optional reading: Why the Windows PowerShell ISE is a poor choice:

While the ISE does have better Unicode rendering support than the console, it is generally a poor choice:
- First and foremost, the ISE is obsolescent: it doesn't support PowerShell Core, where all future development will go, and it isn't cross-platform, unlike the new premier IDE for both PowerShell editions, Visual Studio Code, which already speaks UTF-8 by default for PowerShell Core and can be configured to do so for Windows PowerShell.

- The ISE is generally an environment for developing scripts, not for running them in production (if you're writing scripts (also) for others, you should assume that they'll be run in the console); notably, the ISE's behavior does not match the console's in all respects when it comes to running scripts.

- As eryksun points out, the ISE doesn't support running interactive external console programs, namely those that require user input.
If you're willing to live with that limitation, switching the active code page to `65001` (UTF-8) for proper communication with external programs requires an awkward workaround:

- You must first force creation of the hidden console window by running any external program from the built-in console, e.g., `chcp` - you'll see a console window flash briefly.

- Only then can you set `[console]::OutputEncoding` (and `$OutputEncoding`) to UTF-8, as shown above (if the hidden console hasn't been created yet, you'll get a `handle is invalid` error).
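A compact sketch of that workaround (Windows PowerShell ISE only, run in the ISE's console pane):

```powershell
# Sketch: ISE workaround, following the two steps above.
chcp | Out-Null                                                     # step 1: any external program forces creation of the hidden console window
[console]::OutputEncoding = New-Object System.Text.UTF8Encoding     # step 2: now this no longer fails with "handle is invalid"
$OutputEncoding = New-Object System.Text.UTF8Encoding               # and have PowerShell send UTF-8 to external programs
```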
[1] In PowerShell, if you never call external programs, you needn't worry about the system locale (active code pages): PowerShell-native commands and .NET calls always communicate via UTF-16 strings (native .NET strings) and on file I/O apply default encodings that are independent of the system locale. Similarly, because the Unicode versions of the Windows API functions are used to print to and read from the console, non-ASCII characters always print correctly (within the rendering limitations of the console).
In `cmd.exe`, by contrast, the system locale matters for file I/O (with `<` and `>` redirections, but notably including what encoding to assume for batch-file source code), not just for communicating with external programs in-memory (such as when reading program output in a `for /f` loop).

[2] In PowerShell v4-, where the static `::new()` method isn't available, use `$OutputEncoding = (New-Object System.Text.UTF8Encoding).psobject.BaseObject`. See GitHub issue #5763 for why the `.psobject.BaseObject` part is needed.
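Side by side, the two variants from footnote [2] (nothing beyond what the footnote already states):

```powershell
# PowerShell v5+: the static ::new() method is available
$OutputEncoding = [System.Text.UTF8Encoding]::new()

# PowerShell v4 and below: fall back to New-Object; .psobject.BaseObject works around GitHub issue #5763
$OutputEncoding = (New-Object System.Text.UTF8Encoding).psobject.BaseObject
```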