Voltage Rating
If a device says it needs a particular voltage, then you have to assume it needs that voltage. Both lower and higher could be bad.
At best, with lower voltage the device will simply fail to operate in an obvious way. However, some devices might appear to operate correctly, then fail in unexpected ways under just the right circumstances. When you violate required specs, you don't know what might happen. Some devices can even be damaged by running at too low a voltage for extended periods. If the device has a motor, for example, the motor might not be able to develop enough torque to turn, so it just sits there getting hot. Some devices might draw more current to compensate for the lower voltage, and that higher-than-intended current can damage something. Most of the time, lower voltage will just make a device not work, but damage can't be ruled out unless you know something about the device.
Higher than specified voltage is definitely bad. Electrical components all have voltages above which they fail. Components rated for higher voltage generally cost more or have less desirable characteristics, so picking the right voltage tolerance for the components in the device probably got significant design attention. Applying too much voltage violates the design assumptions. Some level of too much voltage will damage something, but you don't know where that level is. Take what a device says on its nameplate seriously and don't give it more voltage than that.
Current Rating
Current is a bit different. A constant-voltage supply doesn't determine the current: the load, which in this case is the device, does. If Johnny wants to eat two apples, he's only going to eat two whether you put 2, 3, 5, or 20 apples on the table. A device that wants 2 A of current works the same way. It will draw 2 A whether the power supply can only provide the 2 A or could have supplied 3, 5, or 20 A. The current rating of a supply is what it can deliver, not what it will always force through the load somehow. In that sense, unlike with voltage, the current rating of a power supply must be at least what the device wants, but there is no harm in it being higher. A 9 volt 5 amp supply is a superset of a 9 volt 2 amp supply, for example.
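To make that concrete, here is a minimal sketch (plain Python, with made-up numbers) of the idea that the load, not the supply, sets the current: model the device as a fixed resistance and note that the current it draws doesn't change as the supply's current rating goes up.

```python
# Hypothetical device: behaves like an 18 ohm resistor on a 9 V supply,
# so it draws 9 / 18 = 0.5 A no matter how "big" the supply is.
DEVICE_RESISTANCE = 18.0   # ohms (assumed for illustration)
SUPPLY_VOLTAGE = 9.0       # volts

drawn_current = SUPPLY_VOLTAGE / DEVICE_RESISTANCE  # Ohm's law: I = V / R

for supply_rating in (0.3, 2.0, 5.0, 20.0):  # amps the supply *can* deliver
    ok = drawn_current <= supply_rating
    print(f"{supply_rating:>4.1f} A supply: device still draws "
          f"{drawn_current:.2f} A -> {'fine' if ok else 'supply overloaded'}")
```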
Replacing Existing Supply
If you are replacing a previous power supply and don't know the device's requirements, then consider that power supply's rating to be the device's requirements. For example, if an unlabeled device was powered from a 9 V, 1 A supply, you can replace it with a 9 V supply rated for 1 A or more.
Advanced Concepts
The above gives the basics of how to pick a power supply for some device. In most cases that is all you need to know to go to a store or online and buy a power supply. If you're still a bit hazy on what exactly voltage and current are, it's probably better to quit now. This section goes into more power supply details that generally don't matter at the consumer level, and it assumes some basic understanding of electronics.
Regulated versus Unregulated
Unregulated
Very basic DC power supplies, called unregulated, just step down the input AC (generally the DC you want is at a much lower voltage than the wall power you plug the supply into), rectify it to produce DC, add an output cap to reduce ripple, and call it a day. Years ago, many power supplies were like that. They were little more than a transformer, four diodes making a full-wave bridge (which takes the absolute value of the voltage electronically), and the filter cap. In these kinds of supplies, the output voltage is dictated by the turns ratio of the transformer. That ratio is fixed, so instead of producing a fixed output voltage, their output is mostly proportional to the input AC voltage. For example, such a "12 V" DC supply might make 12 V at 110 VAC in, but would make over 13 V at 120 VAC in.
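Here's a rough sketch of why the output tracks the input (the turns ratio and diode drop below are assumptions, not from any particular supply): the transformer divides the AC by its turns ratio, and the bridge plus filter cap give roughly the peak of that, minus two diode drops.

```python
from math import sqrt

TURNS_RATIO = 11.6   # assumed, chosen so ~110 VAC in gives ~12 V out
DIODE_DROP = 0.7     # volts per diode; two diodes conduct in a full-wave bridge

def unregulated_dc_out(v_ac_rms):
    """Rough no-load DC output of a transformer + bridge + filter cap supply."""
    v_secondary_rms = v_ac_rms / TURNS_RATIO
    v_peak = v_secondary_rms * sqrt(2)   # the cap charges to about the peak
    return v_peak - 2 * DIODE_DROP       # minus two diode drops

for v_in in (110, 115, 120):
    print(f"{v_in} VAC in -> about {unregulated_dc_out(v_in):.1f} V DC out")
```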
Another issue with unregulated supplies is that the output voltage is not only a function of the input voltage but also fluctuates with how much current is being drawn from the supply. An unregulated "12 volt 1 amp" supply is probably designed to provide the rated 12 V at full output current and the lowest valid AC input voltage, like 110 V. It could be over 13 V at 110 V in at no load (0 amps out) alone, and higher yet at higher input voltage. Such a supply could easily put out 15 V, for example, under some conditions. Devices that needed the "12 V" were designed to handle that, so that was fine.
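One crude way to picture the load dependence (the numbers below are assumed, not measured from any real supply): treat the unregulated supply as an ideal source behind some effective output resistance, so the voltage sags as the output current rises.

```python
V_NO_LOAD = 13.5   # volts at 0 A out, assumed
R_OUTPUT = 1.5     # ohms of effective output resistance, assumed

for i_out in (0.0, 0.5, 1.0):   # amps drawn by the device
    v_out = V_NO_LOAD - i_out * R_OUTPUT
    print(f"{i_out:.1f} A load -> about {v_out:.1f} V out")
```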
Regulated
Modern power supplies don't work that way anymore. Pretty much anything you can buy as consumer electronics will be a regulated power supply. You can still get unregulated supplies from more specialized electronics suppliers aimed at manufacturers, professionals, or at least hobbyists who should know the difference. For example, Jameco has a wide selection of power supplies. Their wall warts are specifically divided into regulated and unregulated types. However, unless you go poking around where the average consumer shouldn't be, you won't likely run into unregulated supplies. Try asking for an unregulated wall wart at a consumer store that sells other stuff too, and they probably won't even know what you're talking about.
A regulated supply actively controls its output voltage. These contain additional circuitry that can tweak the output voltage up and down. This is done continuously to compensate for input voltage variations and for variations in the current the load is drawing. A regulated 12 volt 1 amp power supply, for example, is going to put out pretty close to 12 V over its full AC input voltage range, as long as you don't draw more than 1 A from it.
Universal input
Since there is circuitry in the supply to tolerate some input voltage fluctuations, it's not much harder to make the valid input voltage range wider and cover any valid wall power found anywhere in the world. More and more supplies are being made like that, and are called universal input. This generally means they can run from 90-240 V AC, and that can be 50 or 60 Hz.
Minimum Load
Some power supplies, generally older switchers, have a minimum load requirement. This is usually 10% of full rated output current. For example, a 12 volt 2 amp supply with a minimum load requirement of 10% isn't guaranteed to work right unless you load it with at least 200 mA. This restriction is something you're only going to find in OEM models, meaning the supply is designed and sold to be embedded into someone else's equipment where the right kind of engineer will consider this issue carefully. I won't go into this more since this isn't going to come up on a consumer power supply.
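If you ever do run into such a supply, the check is just arithmetic; here is a tiny sketch of the 10% figure mentioned above (the device draw is a hypothetical light load):

```python
RATED_CURRENT = 2.0          # amps, the supply's full rating
MIN_LOAD_FRACTION = 0.10     # 10% minimum load requirement

min_load = RATED_CURRENT * MIN_LOAD_FRACTION   # 0.2 A = 200 mA
device_draw = 0.05                             # amps, hypothetical light load

if device_draw < min_load:
    print(f"Load of {device_draw*1000:.0f} mA is below the {min_load*1000:.0f} mA "
          "minimum; the output voltage is not guaranteed to stay in spec.")
```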
Current Limit
All supplies have some maximum current they can provide and still stick to the remaining specs. For a "12 volt 1 amp" supply, that means all is fine as long as you don't try to draw more than the rated 1 A.
There are various things a supply can do if you try to exceed the 1 A rating. It could simply blow a fuse. Specialty OEM supplies that are stripped down for cost could catch fire or vanish into a greasy cloud of black smoke. However, nowadays, the most likely response is that the supply will drop its output voltage to whatever is necessary to not exceed the output current. This is called current limiting. Often the current limit is set a little higher than the rating to provide some margin. The "12 V 1 A" supply might limit the current to 1.1 A, for example.
A device that is trying to draw the excessive current probably won't function correctly, but everything should stay safe, not catch fire, and recover nicely once the excessive load is removed.
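Here is a minimal sketch of what constant-current limiting looks like from the outside (the limit value and load resistances are assumptions): the supply holds its rated voltage until the load would pull more than the limit, then lets the voltage sag so the current stays at the limit.

```python
V_NOMINAL = 12.0     # volts, the "12 V 1 A" supply from the example
I_LIMIT = 1.1        # amps, limit set a little above the 1 A rating (assumed)

def output_voltage(r_load):
    """Output voltage for a resistive load under simple current limiting."""
    if V_NOMINAL / r_load <= I_LIMIT:
        return V_NOMINAL                 # within rating: normal regulation
    return I_LIMIT * r_load              # overloaded: voltage drops to cap current

for r in (15.0, 12.0, 8.0, 4.0):         # ohms of load resistance
    v = output_voltage(r)
    print(f"{r:>4.0f} ohm load -> {v:4.1f} V out, {v / r:.2f} A")
```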
Ripple
No supply, even a regulated one, can keep its output voltage exactly at the rating. Usually due to the way the supply works, there will be some frequency at which the output oscillates a little, or ripples. With unregulated supplies, the ripple is a direct function of the input AC. Basic transformer unregulated supplies fed from 60 Hz AC will generally ripple at 120 Hz, for example. The ripple of unregulated supplies can be fairly large. To abuse the 12 volt 1 amp example again, the ripple could easily be a volt or two at full load (1 A output current). Regulated supplies are usually switchers and therefore ripple at the switching frequency. A regulated 12 V 1 A switcher might ripple ±50 mV at 250 kHz, for example. The maximum ripple might not be at maximum output current.
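For the unregulated case, the ripple comes mostly from the filter cap discharging between line peaks, so a back-of-the-envelope estimate is delta-V = I / (f_ripple x C); the capacitor value below is an assumption chosen to land in the "volt or two" range.

```python
I_LOAD = 1.0          # amps drawn from the supply
F_RIPPLE = 120.0      # Hz, twice the 60 Hz line for a full-wave bridge
C_FILTER = 4700e-6    # farads, assumed filter capacitor

ripple = I_LOAD / (F_RIPPLE * C_FILTER)   # volts peak-to-peak, roughly
print(f"About {ripple:.1f} V of ripple at full load")   # ~1.8 V
```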
Single Output Rails
There are many reasons why power supplies have moved toward a single output rail, and they aren't always obvious.
Years ago it was common for power supplies to output several rails, usually +12 V, +5 V, and -12 V, though other variations were common. Typically, most of the power was available on the +5 V rail, +12 V had the second largest share, and -12 V usually had the least.
But as digital logic started to run from lower voltages, several interesting things happened.
The biggest thing is that the current went up. No great surprise, really. 12 watts at 12 V is just 1 amp, but 12 watts at 1 V requires 12 amps! Modern Intel CPUs might require 50+ amps at somewhere near 1 volt. But as current goes up, so does the voltage drop in the wires, and thus power is wasted. If the power supply is located at the end of a 1-2 foot cable, your losses become large compared to having the power supply right next to the load. Tight voltage regulation also becomes more problematic due to the inductive effects of the cable. So the appropriate thing to do is to have a higher voltage come out of the AC/DC power supply and then regulate it down to a lower voltage at the load. The industry seems to be using +12 V as that higher distribution voltage, although other voltages are not unheard of.
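A quick sketch of the Ohm's-law argument (the cable resistance is an assumed round number): the same 12 watts delivered at 12 V versus 1 V, through the same short cable.

```python
P_DELIVERED = 12.0    # watts the load needs
R_CABLE = 0.02        # ohms of round-trip cable resistance, assumed

for v_rail in (12.0, 1.0):
    i = P_DELIVERED / v_rail          # current the cable must carry
    p_loss = i**2 * R_CABLE           # power wasted in the cable
    print(f"{v_rail:>4.0f} V rail: {i:>5.1f} A, {p_loss:5.2f} W lost "
          f"({100 * p_loss / P_DELIVERED:.1f}% of the delivered power)")
```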
The other thing is that the number of power rails required on a PCB has become large. A recent system that I designed has the following rails: +48v, +15, +12, +6, +3.3, +2.5, +1.8, +1.5, +1.2, +1.0, and -15v. That's eleven power rails! Many of those were for analog circuits, but six of them were for digital logic alone. And as new chips are developed, the number of power rails is increasing and the voltages are decreasing.
What this has done to the AC/DC power supply industry is that manufacturers are standardizing on supplies with a single output rail, usually +12 V, +24 V, or +48 V, with +12 V being the most common by far. Since everyone started putting local DC/DC converters on their PCBs, most of them taking +12 V in, this makes the most sense. Also, because of the volumes of supplies being made, a single +12 V output supply is much easier to get and cheaper than just about any other supply.
There are, of course, other factors that should not be ignored. However, their impact is difficult to agree on, much less explain. I'll just briefly touch on them below.
If a power supply company tried to decide on which rail combinations to manufacture, it would end up with so many variations that it might as well build custom supplies, unless it standardizes on just a couple of common voltages with a single output.
When a supply does have multiple outputs, the split of current among those outputs is usually wrong for the application. Even with just the +5, +12, and -12 rails, it used to be that most of the current was drawn from the +5 V rail; today it would be drawn from the +12 V rail because of all the downstream point-of-load supplies. Add the variations in how power is divided among the rails to the already huge number of voltage options, and even a simple 3-output supply could easily end up with hundreds or thousands of configuration variations.
When building supplies, volume matters. The more you make, the cheaper they can be. If you have a hundred variations of a supply then you have divided your volume for any one variation by 100. That means that your cost has gone up significantly. But if you build 4 variations then the volume can remain high and cost low.
If you have a specific need for what will be a high volume product then it is common to have a completely custom supply. In this case, a multiple-output supply might make sense.
Multiple-output supplies tend to regulate only one rail and let the other rails track it with looser regulation specs. This might not matter for some applications, but for the low-voltage rails used by modern digital logic it can be a killer.
So there you go: single-rail supplies are becoming more and more popular because of technology advances, Ohm's law, and economics.
Update: I was talking about power supplies in general. The same basic concepts apply to both internal and external supplies.
Best Answer
You need a boost converter to lift your 24 V DC to 240 V DC. Because these are not 100% efficient, the input power needed to supply 35 watts at 240 V DC is likely to be about 40 watts.
Then you need to convert it back down to (say) 24 volts at the far end, and this will probably mean the full input power is about 45 watts.
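As a rough power-budget sketch (the converter efficiencies below are assumptions chosen to be consistent with the round numbers above, not measured values):

```python
P_LOAD = 35.0         # watts the remote device needs
EFF_FAR_END = 0.89    # assumed efficiency of the 240 V -> 24 V converter at the far end
EFF_BOOST = 0.875     # assumed efficiency of the 24 V -> 240 V boost

p_cable = P_LOAD / EFF_FAR_END    # ~39-40 W must be delivered over the cable
p_input = p_cable / EFF_BOOST     # ~45 W drawn from the 24 V source
print(f"~{p_cable:.0f} W on the cable, ~{p_input:.0f} W from the 24 V source")
```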
With 40 watts at 240 V DC "pumped" onto the cable by the booster, the current will be 167 mA, and now you have to decide on the wire gauge:
If you choose AWG 24, it has 84.2 ohms per 1000 m. Double this for the return path and multiply by 2 for the 2 km length and you get a total loop resistance of about 337 ohms. This wire has to carry the 167 mA to the load and will accordingly lose voltage along the way.
Simple Ohm's law tells you that the volts lost are 0.167 x 337 ohms = 56 volts. This would need to be added to the 240 V DC you generate from the boost converter, so immediately you are looking at generating more like 296 volts rather than 240 volts.
That's not a problem for the load (or the cable) but your input power has risen to about 50 watts.
AWG 16 has a 2 km loop resistance of only about 53 ohms, so you might choose this instead. With 167 mA flowing, the voltage drop will be about 8.8 volts, so maybe this is a better choice.
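Here is that iteration as a small sketch (the per-km resistances are standard rounded copper figures; the 40 W and 240 V are the numbers from above):

```python
P_CABLE = 40.0        # watts pushed onto the cable by the booster
V_BOOST = 240.0       # volts at the sending end, before adding the drop back in
LENGTH_KM = 2.0       # one-way cable run

i_line = P_CABLE / V_BOOST            # ~0.167 A flowing around the loop

for name, ohms_per_km in (("AWG 24", 84.2), ("AWG 16", 13.2)):
    r_loop = ohms_per_km * LENGTH_KM * 2     # out and back
    v_drop = i_line * r_loop                 # volts lost along the cable
    p_loss = i_line ** 2 * r_loop            # watts burned in the copper
    print(f"{name}: loop {r_loop:5.0f} ohm, drop {v_drop:5.1f} V, "
          f"loss {p_loss:4.1f} W -> generate about {V_BOOST + v_drop:.0f} V")
```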
Hope this helps you get your head around the iterative process of choosing the right cable to suit your power budget.