Looks like they suggest a 2A power supply, so using 2A as the current draw, a 12' 20AWG extension cable will lose roughly 0.5 volts to resistance by the far end. If the actual current draw is lower, the loss will be lower too.
Their own supplied cable is 24AWG at 6', which at 2A will drop another 0.6 volts or so by its end.
So, if you string both in series at the maximum of 2A, you will likely see a drop of roughly 1.1 volts by the other end. In all likelihood, the current will be lower during operation and the voltage drop not as significant.
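If you want to play with the numbers yourself, here is a rough Python sketch of that arithmetic. The ohms-per-1000' figures are typical copper wire table values I'm assuming, and the round-trip length is twice the cable length since both conductors carry the current:

```python
# Approximate resistance of solid copper wire, ohms per 1000 ft (assumed table values)
OHMS_PER_1000FT = {20: 10.15, 24: 25.67}

def cable_resistance(awg: int, length_ft: float) -> float:
    """Round-trip resistance of a two-conductor cable, in ohms."""
    return OHMS_PER_1000FT[awg] * (2 * length_ft) / 1000.0

def voltage_drop(awg: int, length_ft: float, current_a: float) -> float:
    """V_drop = I * R over the whole cable run."""
    return current_a * cable_resistance(awg, length_ft)

extension = voltage_drop(awg=20, length_ft=12, current_a=2.0)  # ~0.49 V
factory   = voltage_drop(awg=24, length_ft=6,  current_a=2.0)  # ~0.62 V
print(f"extension: {extension:.2f} V, factory: {factory:.2f} V, "
      f"total: {extension + factory:.2f} V")                   # ~1.1 V in series
```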
Basic Voltage Drop Law
V_drop = I × R
where:
I : the current flowing through the cable, measured in amperes
R : the resistance of the wires, measured in ohms
So, for example, if you are actually only drawing 0.5A during operation, the total voltage drop will only be in the area of 0.3 volts... probably good to go.
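To make the scaling obvious, here is the same V_drop = I × R arithmetic at a few currents, using the roughly 0.55 ohm combined round-trip resistance implied by the estimates above (a sketch, not gospel):

```python
# Combined round-trip resistance of both cables in series, from the drops estimated
# above at 2A: roughly 0.24 ohm (20 AWG, 12 ft) + 0.31 ohm (24 AWG, 6 ft).
r_total_ohms = 0.24 + 0.31

for current_a in (2.0, 1.0, 0.5):
    drop_v = current_a * r_total_ohms
    print(f"{current_a:.1f} A -> about {drop_v:.2f} V drop")
# At 0.5 A the drop is only around 0.3 V, in line with the estimate above.
```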
I would do some test measurements to see what voltage you are ACTUALLY seeing at the end of the factory plug with NO load, and then with load (from those two readings you can figure out the actual current draw - even easier if you have a clamp meter). Dropping 1V in total may be problematic; it depends on the actual needs of the HM, and on whether the supply is actually putting out 12V, or something higher (or lower). Running a device on too low a voltage can damage it.
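If you go the two-measurement route, this is roughly how the arithmetic works out. The voltage readings below are made-up placeholders, and it assumes the supply's own output doesn't sag between the two readings - which is exactly why a clamp meter is the easier check:

```python
# Hypothetical meter readings - substitute your own measurements.
v_no_load = 12.3  # volts at the end of the factory plug, no load (placeholder)
v_loaded  = 11.9  # volts at the same point with the device running (placeholder)

# Round-trip resistance of the factory 24 AWG, 6 ft cable, using the same
# assumed table value of ~25.67 ohm per 1000 ft.
r_factory_cable = 25.67 * (2 * 6) / 1000.0  # ~0.31 ohm

# The sag across a cable of known resistance gives an estimate of the current draw,
# assuming the supply output itself held steady between the two readings.
current_a = (v_no_load - v_loaded) / r_factory_cable
print(f"estimated draw: {current_a:.2f} A")  # ~1.3 A for these example numbers
```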
Some devices will specify a RANGE of acceptable voltages, and most usually have some play as well.