I say it is reasonably good in that they get some details right on the tools and algorithms used to analyse communications, including (even though they don't use the term) link analysis, emergent grouping, and other statistical analysis methodologies that allow systems and analysts to isolate the abnormal from the billions of normal transactions in the data.
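To make "link analysis" concrete, here is a toy sketch of the basic idea: build a contact graph from call records and flag numbers whose contact count stands out. All the numbers and the threshold below are invented for illustration; real systems use far more sophisticated statistical tests.

```python
# Toy link analysis over hypothetical call metadata.
# Every record here is invented for illustration.
from collections import defaultdict

# Each record is a stripped-down (caller, callee) pair.
calls = [
    ("555-0101", "555-0202"),
    ("555-0101", "555-0303"),
    ("555-0202", "555-0303"),
    ("555-0404", "555-0101"),
    ("555-0909", "555-0101"),
]

def build_link_graph(calls):
    """Build an undirected contact graph: number -> set of contacts."""
    graph = defaultdict(set)
    for a, b in calls:
        graph[a].add(b)
        graph[b].add(a)
    return graph

def degree_outliers(graph, threshold):
    """Flag numbers with unusually many distinct contacts -- a crude
    stand-in for the statistical anomaly tests the article alludes to."""
    return {n for n, contacts in graph.items() if len(contacts) >= threshold}

graph = build_link_graph(calls)
print(degree_outliers(graph, 3))  # only 555-0101 has 3+ distinct contacts
```

The point of the sketch is that nothing about call *content* is needed: the structure of who contacts whom is enough to make one node stand out.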
Over the last week, critics and defenders of the National Security Agency have heatedly debated the merits of metadata – information about the phone activity of millions of Americans that was given to the government via a secret court order.
The information collected includes records of every call placed on the Verizon network (and, it appears, every other U.S. phone carrier), including times, dates, lengths of calls, and the phone numbers of the participants, but not the names associated with the accounts.
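The fields described above amount to a classic call-detail record. A minimal sketch of such a record might look like the following; the field names are assumptions for illustration, and real carrier schemas differ.

```python
# Minimal sketch of a call-detail record as described in the article.
# Field names are hypothetical; actual carrier schemas differ.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class CallRecord:
    caller: str        # originating phone number (no subscriber name)
    callee: str        # receiving phone number (no subscriber name)
    start: datetime    # date and time the call was placed
    duration_s: int    # length of the call, in seconds

rec = CallRecord("555-0101", "555-0202", datetime(2013, 6, 10, 14, 5), 312)
print(rec.duration_s)  # 312
```

Note what is absent: no names, and no recording of what was said. That absence is exactly why defenders call this "just metadata", and why the analysis techniques above matter so much.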
For some, the collection of these data represent a grave violation of the privacy of American citizens. For others, the privacy issue is negligible, as long as it helps keep us safe from terrorism.
There are indeed privacy issues at play here, but they aren’t necessarily the obvious ones. In order to put the most important questions into context, consider the following illustration of a metadata analysis using sample data derived from a real social network. The sample data isn’t derived from telephone records, but it’s close enough to give a sense of the analysis challenges and privacy issues in play.
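In the same spirit as the sample analysis the article promises (not a reproduction of it), the "emergent grouping" mentioned earlier can be sketched as finding connected clusters of numbers in the call graph. The records below are invented.

```python
# Toy "emergent grouping": find clusters of numbers linked by calls.
# Records are invented; this only sketches the general idea.
from collections import defaultdict

calls = [
    ("555-0101", "555-0202"),
    ("555-0202", "555-0303"),
    ("555-0707", "555-0808"),  # a separate, unconnected pair
]

def components(calls):
    """Return the connected components of the undirected call graph."""
    graph = defaultdict(set)
    for a, b in calls:
        graph[a].add(b)
        graph[b].add(a)
    seen, groups = set(), []
    for start in graph:
        if start in seen:
            continue
        stack, group = [start], set()
        while stack:
            n = stack.pop()
            if n in group:
                continue
            group.add(n)
            stack.extend(graph[n] - group)
        seen |= group
        groups.append(group)
    return groups

print(sorted(len(g) for g in components(calls)))  # [2, 3]
```

Groups like these "emerge" from the data itself, with no prior list of suspects, which is both what makes the technique powerful and what raises the privacy questions the article goes on to discuss.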