By pattern. A human being probably won't submit requests at exactly the same interval over a long series. Nor will they submit requests for 48 hours straight, or one every 2 ms. In other words, human requests show less regularity and lower volume.
In my case it was probably mostly the volume, since I already used a random number generator for the intervals (not to disguise anything, just out of courtesy). It wasn't a lot of traffic (compared to what they receive anyway), but it must have gone on too long and (still) been too regular.
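The kind of courtesy delay described above can be sketched like this; `polite_delay` and its default values are hypothetical, just to illustrate adding random jitter between requests:

```python
import random
import time

def polite_delay(base=5.0, jitter=5.0):
    """Sleep for `base` plus a random amount up to `jitter` seconds,
    so the interval between requests is not perfectly regular."""
    interval = base + random.uniform(0.0, jitter)
    time.sleep(interval)
    return interval
```

You would call `polite_delay()` between fetches; note that this randomizes the spacing but, as discussed below, does nothing about the total volume.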
Automated requests may also lack browser identification, such as a User-Agent header (when requesting a web page rather than a web service).
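For instance, a bare scripted fetch sends no User-Agent by default, which is an easy flag. A minimal sketch with the standard library (the URL and User-Agent string here are placeholders, not a recommendation to impersonate a browser):

```python
import urllib.request

# Without an explicit header, urllib identifies itself as
# "Python-urllib/x.y" rather than as a browser.
req = urllib.request.Request(
    "https://example.com/page",
    headers={"User-Agent": "Mozilla/5.0 (X11; Linux x86_64)"},
)
```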
Of course, the most thorough randomized disguise would be randomizing across internet domains, but that won't be practical for most (law-abiding) people. Temporal randomization may not mean much if the jitter is too short, especially when it is shorter than the sampling interval at which the series is analyzed.
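That last point is easy to demonstrate: if the jitter is much smaller than the aggregation window, the per-window request counts barely move. A toy sketch (all numbers here are illustrative assumptions):

```python
import random

random.seed(0)
# Requests nominally every 2 s, with +/-0.5 s of jitter -- far smaller
# than the 60 s window a traffic analyzer might aggregate over.
times = [i * 2.0 + random.uniform(-0.5, 0.5) for i in range(300)]

window = 60.0
counts = {}
for t in times:
    bucket = int(t // window)
    counts[bucket] = counts.get(bucket, 0) + 1

# Per-window counts stay essentially constant (~30 requests/min),
# so at that resolution the series still looks perfectly regular.
print(sorted(counts.values()))
```

At the 60-second resolution the jitter is invisible: the volume signature survives the randomization entirely.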