The power supply doesn't always output its maximum current; instead, it outputs a particular voltage (in your case 5 V), and the load presented by the device determines how much current is drawn. If nothing is connected to the output, the current is zero.
However, if the load attempts to draw too much current (e.g. more than 100 mA from a USB port rated for only 100 mA), then depending on the power supply, the voltage may sag below 5 V, or the supply may stop working altogether due to an over-current shut-down mechanism.
As you stated, the current is determined by Ohm's law, i.e. the voltage divided by the equivalent resistance of the load. As you already calculated, for a 10 mA current at 5 V, the equivalent resistance of your load is 500 Ω. So if you replaced your device with a 500 Ω resistor, it would draw 10 mA. Obviously your device is much more complicated than a simple resistor, but that's what it looks like to the power supply.
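The arithmetic above can be sketched in a few lines (values taken from the answer; the current is kept in milliamps to avoid any unit confusion):

```python
V = 5.0     # supply voltage, volts
I_mA = 10   # desired current, milliamps

# Ohm's law: R = V / I, with the current converted from mA to A
R_equivalent = 1000.0 * V / I_mA
print(R_equivalent)   # 500.0 ohms

# Conversely, a 500-ohm resistor across the 5 V supply draws:
print(V / 500.0)      # 0.01 A, i.e. 10 mA
```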
In many cases, the load will not be a fixed amount. A trivial case is two LED's, each drawing 20 mA. One is on steady, and the other is blinking on and off. So the load varies between 20 mA and 40 mA. The supply will automatically adjust for this varying current.
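To connect this back to the resistor picture, here is the same two-LED example with the equivalent resistance the supply "sees" in each state (assuming the 5 V supply from the question):

```python
V = 5.0         # supply voltage, volts
steady_mA = 20  # LED that is always on
blink_mA = 20   # additional draw when the blinking LED is on (0 mA when off)

# Total load alternates between 20 mA and 40 mA as the second LED blinks
for total_mA in (steady_mA, steady_mA + blink_mA):
    r_equiv = 1000.0 * V / total_mA
    print(total_mA, "mA ->", r_equiv, "ohm")  # 20 mA -> 250.0, 40 mA -> 125.0
```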
Unfortunately, yes: if the USB 2.0 device seeks USB-IF certification, it must obey the sequence 100 mA -> USB connect -> enumerated/configured -> full draw of port current. The "full draw", however, should be determined by sensing the value of the pull-up on the attached CC pin.
If the attached cable is a "legacy cable" compliant with the Type-C specification, it has a 56 k pull-up built in (you generally don't need any adapter; the cable provides the pull-up).
If the attached cable is a "Type-C Standard Cable" connected to another Type-C port, the CC pull-up will be defined by the sourcing port, according to whichever capability it advertises, 1.5 A or 3 A.
If you want your device to charge practically, and charge faster, it is advisable to implement a battery-charging detector IC, at least to determine whether the port presents the Chinese-style charger signature, D+ connected to D-.
To really comply with USB-IF test specifications when using Type-C connector, you need to consult very carefully with this document, Type_C_Functional_Test_Specifications. This is an evolving area, so check for updates.
But if you don't bother with exhaustive USB-IF certification (as most manufacturers do), just take 500 mA if it is enough for you, since every reputable host USB port must unconditionally support 500 mA of sourcing (except a nearly nonexistent subset of low-powered portable gadgets running from tiny batteries).
In a standard (500 mA) situation you don't need any ICs to build your device with a Type-C connector. All you need is to connect D+/D- from Side A and Side B together, and put a pull-down resistor (5.1 k) on EACH of the CC1 and CC2 pins.
However, if you plan to take 1.5 A (or 3 A) out of the cable, you will need to check the voltage level on the CC1 or CC2 pin. This level depends on the host port's advertised capability: a Type-C host with 3 A capability will have a 10 k pull-up to 5 V, 1.5 A capability will be advertised with a 22 k pull-up, and standard port capability (500 mA) will have a 56 k pull-up. If you don't want to overload a legacy host, you should watch the CC levels and limit your power consumption accordingly.
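As a rough illustration of the CC sensing described above, the sink's 5.1 k pull-down forms a divider with the host's pull-up, and the resulting voltage identifies the advertisement. This is only a sketch: the pull-up values are the ones listed above, but the decision thresholds here are simple midpoints between the ideal divider voltages, not the official Type-C detection thresholds, and a real design should follow the specification.

```python
RD = 5.1e3  # sink pull-down on the CC pin (Rd)

def cc_voltage(r_pullup, vbus=5.0):
    """Ideal divider voltage on CC for a given host pull-up (Rp)."""
    return vbus * RD / (RD + r_pullup)

V_DEFAULT = cc_voltage(56e3)  # ~0.42 V -> default (500 mA)
V_1A5     = cc_voltage(22e3)  # ~0.94 V -> 1.5 A
V_3A      = cc_voltage(10e3)  # ~1.69 V -> 3 A

def advertised_limit_mA(v_cc):
    """Classify a measured CC voltage against the nominal divider levels."""
    # Midpoints between the nominal levels serve as illustrative thresholds.
    if v_cc > (V_1A5 + V_3A) / 2:
        return 3000
    if v_cc > (V_DEFAULT + V_1A5) / 2:
        return 1500
    return 500

print(advertised_limit_mA(0.41))  # 500
print(advertised_limit_mA(1.0))   # 1500
print(advertised_limit_mA(1.7))   # 3000
```

In firmware, `v_cc` would come from an ADC reading of the CC pin, and the returned limit would cap the charger's input-current setting.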
Limiting power intake can be challenging without an intelligent IC, but several manufacturers (Texas Instruments, NXP, Microchip, Rohm, etc.) offer suitable solutions.