
Links for 2013-08-05

  • Filters ‘not a silver bullet’ that will stop perverts, warns Interpol chief – Independent.ie

    Sunday Independent interview with Interpol assistant director Mick Moran:

    Moran spoke out after child welfare organisations here called on the Government to follow the UK’s example by placing anti-pornography filters on Irish home broadband connections. The Irish Society for the Prevention of Cruelty to Children argued that pornography was damaging to young children and should be removed from their line of sight. But Moran warned this would only lull parents into a false sense of security. “If we imagine the access people had to porn in the past – that access is now complete and total. They have access to the most horrific material out there. We now need to focus on parental responsibility about how kids are using the internet.”

    (tags: mick-moran cam interpol policing ispcc filtering parenting children broadband)

  • Coordinated Omission

    Gil Tene raises an extremely good point about measuring high-percentile response times when load-testing a system (a toy sketch of the problem follows below):

    I’ve been harping for a while now about a common measurement technique problem I call “Coordinated Omission”, which can often render percentile data useless. […] I believe that this problem occurs extremely frequently in test results, but it’s usually hard to deduce its existence purely from the final data reported. But every once in a while, I see test results where the data provided is enough to demonstrate the huge percentile-misreporting effect of Coordinated Omission based purely on the summary report.

    I ran into just such a case in Attila’s cool posting about log4j2’s truly amazing performance, so I decided to avoid polluting his thread with an elongated discussion of how to compute 99.9%’ile data, and started this topic here. That thread should really be about how cool log4j2 is, and I’m certain that it really is cool, even after you correct the measurements. […] Basically, I think that the 99.99% observation computation is wrong, and demonstrably (using the graph data posted) exhibits the classic “coordinated omission” measurement problem I’ve been preaching about.

    This test is not alone in exhibiting this, and there is nothing to be ashamed of when you find yourself making this mistake. I only figured it out after doing it myself many, many times, and then I noticed that everyone else seems to also be doing it but most of them haven’t yet figured it out. In fact, I run into this issue so often in percentile reporting and load testing that I’m starting to wonder if coordinated omission is there in 99.9% of latency tests ;-)

    (tags: measurement testing latency load-testing gil-tene coordinated-omission validity log4j percentiles)
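
    Here is the promised toy sketch of the mistake itself; it is not Gil’s fix (HdrHistogram addresses this by back-filling the omitted samples), and the flaky_service, rates and stall numbers are all invented for illustration. A closed-loop tester waits for each response before sending the next request, so a stall silently swallows every request slot that should have fired during it and the naive percentiles never see the damage; recording each latency from the intended send time restores it:

    ```python
    import time
    from itertools import count

    _calls = count()

    def flaky_service():
        # Hypothetical service: ~1 ms per request, with one deliberate
        # 1-second stall mid-run (think GC pause). Numbers are made up.
        time.sleep(1.0 if next(_calls) == 1000 else 0.001)

    def load_test(request, rate_hz=200, n=2000):
        """Closed-loop tester: one request at a time, paced at rate_hz."""
        interval = 1.0 / rate_hz
        naive, corrected = [], []
        intended = time.monotonic()  # when the next request *should* be sent
        for _ in range(n):
            actual = time.monotonic()
            request()                          # blocks until the response
            done = time.monotonic()
            naive.append(done - actual)        # service time as the tester saw it
            corrected.append(done - intended)  # latency from the intended send time
            intended += interval
            # Sleep until the next slot. Slots that went by during a stall
            # are silently skipped -- that skipping is coordinated omission.
            time.sleep(max(0.0, intended - time.monotonic()))
        return naive, corrected

    def percentile(samples, q):
        return sorted(samples)[int(q * (len(samples) - 1))]

    naive, corrected = load_test(flaky_service)
    print(f"99.9th %ile, naive:     {percentile(naive, 0.999):.3f} s")
    print(f"99.9th %ile, corrected: {percentile(corrected, 0.999):.3f} s")
    ```

    On a typical run the naive 99.9th percentile comes out around 1 ms while the corrected one is close to the full 1-second stall: roughly 200 scheduled requests were queued behind the stall, and only the corrected numbers charge the wait to them.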

  • Xerox scanners/photocopiers randomly alter numbers in scanned documents · D. Kriesel

    Pretty major Xerox fail: photocopied/scanned docs turn out to silently swap digits, replacing ‘6’ with ‘8’, due to a poor choice of compression technique (a toy illustration of the failure mode follows below):

    Several mails I got suggest that the Xerox machines use JBIG2 for compression. This algorithm creates a dictionary of image patches it finds “similar”. Those patches then get reused instead of the original image data, as long as the error generated by them is not “too high”. Makes sense. This would also explain why the error occurs when scanning letters or numbers in low resolution (still readable, though). In this case, the letter size is close to the patch size of JBIG2, and whole “similar” letters or even letter blocks get replaced by each other.

    (tags: jbig2 compression xerox photocopying scanning documents fonts arial image-compression images)
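
    The failure mode is easy to reproduce with a toy version of that dictionary scheme. This is a sketch of pattern-matching-and-substitution in general, not the actual JBIG2 codec; the 5×3 glyphs, the mismatch metric and the max_error threshold are all invented for illustration:

    ```python
    import numpy as np

    def compress_patches(patches, max_error=0.10):
        """Toy pattern-matching-and-substitution coder: each binary patch
        is replaced by the index of the first dictionary entry whose pixel
        mismatch rate is within max_error; otherwise the patch becomes a
        new dictionary entry."""
        dictionary, indices = [], []
        for patch in patches:
            for i, entry in enumerate(dictionary):
                if np.mean(patch != entry) <= max_error:  # "similar enough"
                    indices.append(i)                     # reuse the old patch
                    break
            else:
                dictionary.append(patch.copy())
                indices.append(len(dictionary) - 1)
        return dictionary, indices

    def decompress(dictionary, indices):
        return [dictionary[i] for i in indices]

    # Low-resolution glyphs: a 5x3 '6' and '8' differ in a single pixel,
    # a mismatch rate of 1/15 (about 6.7%) -- under the 10% threshold.
    six   = np.array([[1,1,1], [1,0,0], [1,1,1], [1,0,1], [1,1,1]])
    eight = np.array([[1,1,1], [1,0,1], [1,1,1], [1,0,1], [1,1,1]])

    dictionary, indices = compress_patches([six, eight])
    restored = decompress(dictionary, indices)
    print(indices)                             # [0, 0]: both digits share one entry
    print(np.array_equal(restored[1], eight))  # False: the '8' came back as a '6'
    ```

    Exactly as the quote describes: once the per-patch error tolerance exceeds the few pixels that distinguish two small glyphs, “similar” characters collapse into one dictionary entry and the decompressed page confidently shows the wrong digit.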
