Using “web crawler” software designed to search, index and back up a website, Mr. Snowden “scraped data out of our systems” while he went about his day job, according to a senior intelligence official. “We do not believe this was an individual sitting at a machine and downloading this much material in sequence,” the official said. The process, he added, was “quite automated.”
He essentially crawled the NSA's internal wiki. Apparently an internal crawler would have been blocked by security protocols at NSA headquarters, but those protocols hadn't yet been implemented at the Honolulu NSA outpost where Snowden worked for Dell and then Booz Allen Hamilton. I'd love to know which software he used.
In interviews, officials declined to say which web crawler Mr. Snowden had used, or whether he had written some of the software himself.
Officials say web crawlers are almost never used on the N.S.A.’s internal systems, making it all the more inexplicable that the one used by Mr. Snowden did not set off alarms as it copied intelligence and military documents stored in the N.S.A.’s systems and linked through the agency’s internal equivalent of Wikipedia.
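The mechanics described above, automated traversal of internal links, saving each page along the way, are the core of any crawler. Since officials won't say which software was used, here is a minimal, purely illustrative sketch of how such a tool works: a breadth-first walk over hyperlinks that archives every page it reaches. The in-memory "wiki" and all names below are hypothetical stand-ins, not anything from the actual incident.

```python
# Illustrative sketch of a wiki crawler: breadth-first traversal of
# internal links, archiving each page it visits. The page graph is an
# in-memory stand-in for a real site; all names here are hypothetical.
from collections import deque
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect href targets from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(fetch, start_page):
    """BFS from start_page; fetch(page) returns HTML or raises KeyError."""
    seen = {start_page}
    queue = deque([start_page])
    archive = {}
    while queue:
        page = queue.popleft()
        try:
            html = fetch(page)
        except KeyError:
            continue  # dead link, skip it
        archive[page] = html  # "back up" the page
        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return archive

# Toy stand-in for an internal wiki: three pages linking to each other,
# plus one dead link.
wiki = {
    "Main_Page": '<a href="Reports">reports</a> <a href="Briefings">briefings</a>',
    "Reports":   '<a href="Main_Page">home</a>',
    "Briefings": '<a href="Reports">reports</a> <a href="Missing">gone</a>',
}
archive = crawl(wiki.__getitem__, "Main_Page")
print(sorted(archive))  # → ['Briefings', 'Main_Page', 'Reports']
```

The point of the sketch is how little it takes: a queue, a visited set, and a link parser. That's why officials calling the process "quite automated" is unsurprising, and why the lack of alarms on such repetitive, mechanical access patterns is the real story.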