In theory, per-file encryption is weaker: it hands the attacker potentially valuable information about every file (count, sizes, timestamps), whereas with one giant encrypted disk the attacker can't even tell whether the disk contains any files at all. In practice, both are strong. Personally, for sensitive data I encrypt the sensitive file (e.g., a KeePass database) and store it on an encrypted disk. That way, while my system is running (and can read the decrypted disk), the sensitive file is still protected unless I've specifically opened it; and if my disk were stolen, or someone tried changing data on my hard drive from a live CD, the data as a whole would still be protected.
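As a minimal sketch of the per-file layer, here's one way to do it in Python with the pyca/cryptography library; the passphrase-derived key via scrypt and the file name are just illustrative assumptions, not a recommendation of specific parameters:

```python
import base64
import os

from cryptography.fernet import Fernet
from cryptography.hazmat.primitives.kdf.scrypt import Scrypt

def encrypt_file(path: str, passphrase: bytes) -> None:
    """Encrypt `path`, writing `path + '.enc'` containing salt || ciphertext."""
    salt = os.urandom(16)  # fresh random salt per file
    kdf = Scrypt(salt=salt, length=32, n=2**15, r=8, p=1)
    key = base64.urlsafe_b64encode(kdf.derive(passphrase))
    with open(path, "rb") as f:
        ciphertext = Fernet(key).encrypt(f.read())
    with open(path + ".enc", "wb") as f:
        f.write(salt + ciphertext)

# hypothetical file and passphrase, for illustration only
encrypt_file("passwords.kdbx", b"correct horse battery staple")
```

The point of the layering is that this ciphertext stays protected even while the disk itself is mounted and decrypted.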
From file sizes, modification times and other metadata, the attacker may be able to match some standard system files against their original versions, and this information could in principle be of use in a known-plaintext attack. Modern ciphers defend against this by, among other techniques, using large, random, unique initialization vectors (IVs). Note, though, that the fundamental weakness of WEP wireless encryption was its short IVs (only 24 bits): by the birthday paradox, only about 2^12 = 4096 messages needed to be captured before the same IV reappeared, and once an IV repeats, knowing one plaintext lets you recover the other plaintext encrypted under it.
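You can check the birthday math empirically. This is a small sketch (plain Python standard library, nothing WEP-specific) that draws random 24-bit IVs until one repeats; the average lands near sqrt(pi/2 * 2^24) ≈ 5100, consistent with the ~2^12 rule of thumb above:

```python
import secrets

def messages_until_iv_repeat(iv_bits: int = 24) -> int:
    """Draw random IVs until one repeats; return how many were drawn."""
    seen = set()
    while True:
        iv = secrets.randbits(iv_bits)
        if iv in seen:
            return len(seen) + 1
        seen.add(iv)

trials = [messages_until_iv_repeat() for _ in range(100)]
print(sum(trials) / len(trials))  # averages around sqrt(pi/2 * 2^24) ~ 5133
```

With a 96-bit or 128-bit random IV, the same collision point sits around 2^48 or 2^64 messages, which is why large IVs make this attack impractical.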
There is also extra information an attacker could obtain from file sizes and timestamps under the per-file method even without decrypting anything: e.g., from the number of files and their access dates you could tell whether people are currently actively working on a project.
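To make that concrete, here's a minimal sketch of what an attacker with read access to the per-file ciphertexts could glean from metadata alone (the directory path is hypothetical):

```python
import os
from datetime import datetime, timedelta, timezone

def recently_active(project_dir: str, days: int = 7) -> list[str]:
    """List files modified within `days` -- readable without any key."""
    cutoff = datetime.now(timezone.utc) - timedelta(days=days)
    active = []
    for entry in os.scandir(project_dir):
        mtime = datetime.fromtimestamp(entry.stat().st_mtime, timezone.utc)
        if entry.is_file() and mtime > cutoff:
            active.append(f"{entry.name}: {entry.stat().st_size} bytes, {mtime:%Y-%m-%d}")
    return active

print(recently_active("/mnt/project"))  # hypothetical mount point
```

A full-disk ciphertext leaks none of this, which is the trade-off the first paragraph describes.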