Digital design does not have much in common with software development (except, perhaps, that Verilog syntax looks a bit like C, but it only looks like it). That makes it very hard to answer this kind of question adequately. But as someone who walked the path from software development to hardware design, I'll give it a shot. Looking back, here is how I would have advised myself back then, knowing what I know now:
Start from scratch
Forget everything about software development, especially programming languages; those principles do not apply in digital design. It would probably be easy for someone who has designed a CPU to program it in assembly or even C, but an assembly programmer won't be able to design a CPU.
On your learning path, do not try to solve what seems to be an easy problem with your existing software knowledge. A classic example is the "for loop". Even though you can write a for loop in, say, Verilog, it serves a different purpose: it is mainly used for code generation. You can also write a for loop the way software developers see it, but it won't be good for anything except simulation (i.e. you won't be able to program an FPGA like that).
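To make the contrast concrete, here is a minimal sketch (module and parameter names are my own invention): a synthesizable Verilog for loop is unrolled into parallel hardware by the synthesizer, not executed step by step at run time.

```verilog
// Sketch: a for loop in synthesizable Verilog replicates hardware.
// The loop below is unrolled by the synthesizer into a chain of
// N-1 XOR gates; it is NOT executed N times like a software loop.
module xor_reduce #(parameter N = 8) (
    input  wire [N-1:0] in,
    output reg          out
);
    integer i;
    always @* begin
        out = 1'b0;
        for (i = 0; i < N; i = i + 1)
            out = out ^ in[i];
    end
endmodule
```

A loop with a data-dependent bound, or one containing delays like `#10`, is the "software-style" kind: fine in a testbench, but not something the tools can turn into a circuit.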
So for every task you want to tackle, don't assume you already know how to do it; do some research instead: check books and examples, ask more experienced people, etc.
Learn hardware and HDL language
The most popular HDLs are Verilog and VHDL. There are also vendor-specific ones like AHDL (Altera HDL). Since these languages are used to describe hardware components, they all express pretty much the same things in a similar fashion, just with different syntax.
Some people recommend learning Verilog because it looks like C. Yes, its syntax is a mix of C and Ada, but that doesn't make it easy for a software developer to learn. In fact, I think it may even make things worse, because there will be a temptation to write C in Verilog. That's a good recipe for a very bad time.
With that in mind, I'd recommend starting with VHDL, though Verilog is also fine as long as the above is taken into account.
One important thing to keep in mind is that you must understand what you are expressing with the language: what kind of hardware is being "described" and how it works.
For that reason, I'd recommend getting a book on electronics in general, plus a good book like this one: HDL Chip Design (also known as "the blue book").
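As a taste of what "describing hardware" means, here is a hedged sketch (names are mine): the process below is not "executed" line by line; synthesis reads it as a description of an 8-bit register with a synchronous reset and an incrementer in front of it.

```verilog
// Sketch: this text DESCRIBES a circuit. Synthesis infers eight
// flip-flops (an 8-bit register) clocked by clk, with a synchronous
// reset and an adder feeding the register's input.
module counter (
    input  wire       clk,
    input  wire       rst,
    output reg  [7:0] count
);
    always @(posedge clk) begin
        if (rst)
            count <= 8'd0;
        else
            count <= count + 8'd1;
    end
endmodule
```

Until you can look at code like this and picture the flip-flops, the adder, and the clock, the language is just syntax.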
Get a simulator
Before you start doing anything in hardware or use any vendor-specific features, get yourself a simulator. I started with Verilog and used Icarus Verilog along with GTKWave; both are free, open-source projects. Run the examples you see in books and practice by designing your own circuits to get a taste of it.
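A typical first workflow looks something like this sketch (assuming a `counter` module like the one above; module and file names are my own):

```verilog
// Minimal testbench sketch for Icarus Verilog. It drives a clock and
// reset, and dumps a VCD waveform file that GTKWave can display.
`timescale 1ns/1ps
module counter_tb;
    reg clk = 0, rst = 1;
    wire [7:0] count;

    counter dut (.clk(clk), .rst(rst), .count(count));

    always #5 clk = ~clk;          // 10 ns period clock

    initial begin
        $dumpfile("counter.vcd");  // waveform file for GTKWave
        $dumpvars(0, counter_tb);
        #20  rst = 0;              // release reset after two cycles
        #200 $finish;
    end
endmodule
```

You would then compile with `iverilog`, run the resulting simulation with `vvp`, and open `counter.vcd` in GTKWave to inspect the signals.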
Get a development board
When you feel ready to move forward, get a development board. If you know that your employer wants to go with Lattice, get a Lattice board.
The programming methods are very similar, but the details differ: different tools, different options, different interfaces. Usually, if you have experience with one vendor, it is not hard to switch, but you probably want to avoid that extra learning curve.
I'd also make sure the board comes with the components you are planning to use, or is extensible. For example, if you want to design a network device like a router, make sure the board has an Ethernet PHY or can be extended with one through, say, an HSMC connector.
Boards usually come with a good reference manual, a user guide, and design examples. Study them.
Read books
You will need to read books. In my case, I had no friends who knew digital design, and this site wasn't very helpful either, for one simple reason: I didn't even know how to phrase my questions. All I could come up with was something like "Uhm, guys, there is this thing called dcfifo, and I heard something about clock domain crossing challenges; what is it, and why doesn't my design work?".
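For the curious, the clock domain crossing topic behind that naive question often comes down to structures like the classic two-flop synchronizer, sketched below (module and signal names are mine; note that multi-bit data needs a dual-clock FIFO such as dcfifo instead):

```verilog
// Sketch: two-flop synchronizer for moving a SINGLE bit into a new
// clock domain. The first flop may go metastable; the second stage
// gives it a full cycle to settle before the signal is used.
module sync_2ff (
    input  wire clk_dst,   // destination clock domain
    input  wire async_in,  // signal from another clock domain
    output wire sync_out
);
    reg [1:0] ff;
    always @(posedge clk_dst)
        ff <= {ff[0], async_in};
    assign sync_out = ff[1];
endmodule
```

Books are what get you from "why doesn't my design work?" to knowing that this kind of structure exists and when it applies.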
I personally started with these:
FPGA vendors publish a lot of cookbooks with best practices. Study them along with the reference designs. Here is one from Altera, for example.
Come back with more specific questions
As you work through your books, simulate designs, and blink some LEDs on your development board, you will most likely accumulate a lot of questions. Before asking them here, make sure the answer isn't on the next page of the book or already available online (e.g. in a Lattice-specific forum).
Best Answer
I have been playing with it some, and it is an interesting technology. Your OpenCL kernel gets instantiated as hardware in a "sandbox" surrounded by PCIe and memory interfaces. An optimal kernel for their flavor of OpenCL is heavily pipelined, because that is where FPGAs shine. There are also options for vectorization, replication, loop unrolling, etc.
Downsides: very few FPGA development boards support OpenCL at this time, and they tend to be pretty pricey (think $5k). Compilation takes many hours (hey, it's an FPGA!), and you need a big machine in terms of RAM. They recommend a minimum of 24GB, but for anything more than "hello, world" I would make sure you have at least 64GB. I'm not sure how much the licenses cost, but since you are talking Stratix V parts plus a separate license for the OpenCL SDK, it's certainly not free.
The way it all works: the compiler (LLVM-based, like most other OpenCL compilers) compiles your OpenCL kernel into LLVM IR (Intermediate Representation), and then an Altera tool generates Verilog from that. The Verilog instantiation of the kernel is embedded into a Qsys system for the other interfaces (board-dependent, mostly PCIe and RAM, clocks and resets, based on Avalon components). Then it all grinds through the Quartus toolchain to produce a bitfile, which is wrapped into an AOCX file for easy consumption by an OpenCL host. This takes hours.
You can run this whole thing without knowing anything about FPGAs, which is pretty cool. Getting something to work is easy; how easy it is to build something that performs well, I don't know yet, as I'm still learning. There is an interesting optimization guide available here.
Overall, the technology definitely has a lot of potential, but it still feels somewhat bleeding edge (as of Quartus 13.1).