(NB: If any of the below is unclear, please let me know and I will clarify.)
I recently acquired an 8TB USB 3.0 drive. In terms of power requirements, the drive is rated (as near as I can make out) at 12V 4A.
The drive has two FW800 ports on the back; I will use one of those ports to transfer data. The drive also comes with a power cable with a barrel-type connector on one end (to plug into the drive) and a USB-A plug on the other end (to draw power from a USB port of some kind), which is where the USB hub comes in.
The particular USB hub I am looking at for this is described as "a universal AC 100 – 240 V to DC 5V 6A with 6 USB Ports Power Supply." The six USB ports are connected in parallel, so the available current is divided according to the number of charging devices I plug in. I plan on plugging in two devices (a smartphone and the aforementioned USB drive), so presumably each USB port will be able to provide up to 3 amps.
ETA - For the sake of anyone faced with a similar problem in the future, this matter hangs on the following formula:
W = V * A (or: watts equals volts multiplied by amps)
In this case, we have a hard drive with a power supply drawing 12 volts at 4 amps, so: 12V * 4A = 48 Watts
Meanwhile, as suicidal_orange points out, the USB hub supplies 5 volts at 6 amps so: 5V * 6A = 30 Watts.
Bottom line: Plugging a 48 Watt device into a hub that can supply at most 30 Watts (total, across all ports) and expecting it to operate is not likely to meet with much success.
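For anyone who wants to run this check themselves, the arithmetic above can be sketched in a few lines of Python (the `watts` helper and the specific values are just the figures from this post, not anything standard):

```python
def watts(volts: float, amps: float) -> float:
    """Power in watts: W = V * A."""
    return volts * amps

# Values from this post (hypothetical example, check your own device's label).
drive_draw = watts(12, 4)  # the drive's supply draws 12 V at 4 A -> 48 W
hub_supply = watts(5, 6)   # the hub outputs 5 V at 6 A total     -> 30 W

print(f"Drive needs {drive_draw:.0f} W; hub supplies {hub_supply:.0f} W total")
print("Enough power" if hub_supply >= drive_draw else "Not enough power")
```

Note the comparison is against the hub's *total* output; splitting 30 W across two ports only makes the shortfall worse.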
(Well, now I know.)