Dave mulls a poor choice of exit music, another hack targeting online love-seekers, and an emotional development in the world of AI
If you have been to as many channel vendor get-togethers as I have, you've probably heard I Gotta Feeling by The Black Eyed Peas roughly 10 million times and U2's Elevation at least three million.
Apart from being two of the most objectively terrible pieces of music to ever exist, both tunes have been around for quite a few years now, so some new chart smashers to liven up a business unit director's keynote are probably somewhat overdue.
In the last half-decade or so, few artists have smashed the charts so reliably, and with such nauseatingly catchy hooks, as Bruno Mars. So it was no surprise that, following a morning of listening to rousing addresses at a vendor shindig last month, I and hundreds of other channel types exited the arena to the pop heavyweight's recent hit 24K Magic.
I have no doubt that it will prove to be another big hit for Mr Mars, but some of the song's racier content did make me wonder whether it was the best choice to pump out at ear-splitting, conversation-precluding volume to a room full of British reseller execs. We could just about live with the frequent four-letter words, and we managed to grit our teeth and stare at the floor following the reference to "bad bitches and ya ugly ass friends".
But when Mars suggestively cooed about pretty girls waking up his rocket, I think I speak for all of us when I say that, at that point, we would have been happy for the earth to be torn asunder and for the ground to swallow us whole.
And what's worse: we wouldn't have minded going back to The Black Eyed Peas.
Dating websites are a perennially popular target for the cybercriminals of this world, so it was only a matter of time before Harry and Henrietta Hacker took aim at the online romance tool of choice for the metropolitan media elite: Guardian Soulmates.
The recent breach saw the exposure of the email addresses and online usernames of a number of subscribers to the service, which was launched by the newspaper in 2004. Full details were yet to emerge as we went to press, but a statement by the publication's brass indicated that "our ongoing investigations point to a human error by one of our third-party technology providers".
"We take matters of data security extremely seriously and have conducted thorough audits and are confident that no outside party breached any of these systems," the statement added.
"We have taken appropriate measures to ensure this does not happen again."
The Guardian took pains to stress that the problem has been fixed and that no highly sensitive information - such as birthdates or credit card details - was compromised by the breach. That said, a number of users' private messages concerning favourite quinoa recipes, critical assessments of the films of François Truffaut, and what is to be done about the shadow cabinet were leaked. Probably.
Humankind's eternal enslavement in a self-created cage of obsolescence inched closer this week, with news that techno bods in China have developed a chatbot capable of mimicking human emotions.
The technology is known as an ECM - emotional chatting machine. (And if you're expecting a cheap sexist joke on the back of that name, then you're obviously the kind of boorish, unenlightened goon who hasn't learned the valuable life lessons of several employment tribunals, a Sensitivity in the Workplace course, and the court-mandated appointment of a chief inclusion officer.)
The ECM represents a step change in sophistication from other chatbots, as it can not only respond to users with factually and contextually appropriate answers, but can also inject its comments with emotional reactions, including sadness, happiness, or disgust. A paper accompanying the technology's release revealed that, in a study, some 61 per cent of users claimed they preferred talking to the ECM to talking to a neutral chatbot.
The development came after researchers processed data from 23,000 posts on Chinese social media site Weibo, each of which was classified by its primary emotion. Recognising these differing feelings allowed the ECM to respond to a human's gripe about traffic delays with support, encouragement, or sympathy - "sometimes life just sucks!".
If it can recognise basic human emotions, then this thing is one up on most of my sales team. I wonder if they can teach it to quote on a server refresh…
■ Dave Diamond-Geezer, director of Digital Online Deals and Global Integration (Dodgi) of Dagenham Ltd