Social Cameras

Ido Arev*, Hyun Soo Park*, Yaser Sheikh, Jessica Hodgins, and Ariel Shamir, "Automatic Editing of Footage from Multiple Social Cameras," ACM Transactions on Graphics (SIGGRAPH), 2014

We present an approach that takes multiple videos captured by social cameras—cameras that are carried or worn by members of the group involved in an activity—and produces a coherent "cut" video of the activity. Footage from social cameras contains an intimate, personalized view that reflects the part of an event that was of importance to the camera operator. We leverage the insight that social cameras share the focus of attention of the people carrying (or wearing) them. We use this insight to determine where the important "content" in a scene is taking place, and use it in conjunction with cinematographic guidelines to select which cameras to cut to and to determine the timing of those cuts. A trellis graph formulation is used to optimize an objective function that maximizes coverage of the important content in the scene, while respecting cinematographic guidelines such as the 180-degree rule and avoiding jump cuts. We demonstrate cuts of the videos in various styles and lengths for a number of scenarios, including sports games, street performance, family activities, and social get-togethers. We evaluate our results through an in-depth analysis of the cuts in the resulting videos and through comparison with videos produced by a professional editor and existing commercial solutions.
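
The trellis-graph optimization described in the abstract can be pictured as a best-path problem: one node per (time step, camera) pair, with path scores that reward coverage of the shared focus of attention and penalize switching cameras. The sketch below is a minimal Viterbi-style illustration of that idea, not the authors' implementation; the `coverage` scores, the single scalar `cut_cost`, and the function name are hypothetical stand-ins, and a real system would also encode the 180-degree rule and shot-length preferences in the edge weights.

```python
def best_cut(coverage, cut_cost=0.5):
    """Camera sequence maximizing total coverage minus cut penalties (DP over a trellis)."""
    T, C = len(coverage), len(coverage[0])
    score = [list(coverage[0])]        # score[t][c]: best path score ending at (t, c)
    back = [[0] * C]                   # back-pointers for recovering the path
    for t in range(1, T):
        row, ptr = [], []
        for c in range(C):
            # Best predecessor: staying on the same camera is free, switching pays cut_cost.
            prev = max(range(C),
                       key=lambda p: score[t - 1][p] - (cut_cost if p != c else 0.0))
            row.append(score[t - 1][prev]
                       - (cut_cost if prev != c else 0.0) + coverage[t][c])
            ptr.append(prev)
        score.append(row)
        back.append(ptr)
    c = max(range(C), key=lambda x: score[-1][x])   # best final node
    path = [c]
    for t in range(T - 1, 0, -1):                   # trace back-pointers to t = 0
        c = back[t][c]
        path.append(c)
    return path[::-1]

# Toy example: three cameras, five time steps; each row scores how well each
# camera covers the joint focus of attention at that moment.
coverage = [[0.9, 0.2, 0.1],
            [0.8, 0.3, 0.1],
            [0.2, 0.9, 0.1],
            [0.1, 0.8, 0.2],
            [0.1, 0.2, 0.9]]
print(best_cut(coverage))  # -> [0, 0, 1, 1, 2]
```

A fixed per-switch penalty is the simplest way to discourage jump cuts; the paper's objective is richer, but the trellis structure makes any such per-edge cost easy to fold into the same dynamic program.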

AI programs exhibit racial and gender biases, research reveals

The latest paper shows that some more troubling implicit biases seen in human psychology experiments are also readily acquired by algorithms. […] The AI system was more likely to associate European American names with pleasant words such as “gift” or “happy”, while African American names were more commonly associated with unpleasant words.
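
As an illustration of the kind of measurement behind this finding, the sketch below scores a name by its average embedding similarity to pleasant versus unpleasant words, in the spirit of the word-embedding association test the research describes. Everything here is assumed: the two-dimensional vectors are made up so the example runs without downloading pretrained embeddings, and the names and word lists are illustrative stand-ins for the study's categories.

```python
import math

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.hypot(*u) * math.hypot(*v))

# Toy 2-d "embeddings"; a real test would use vectors trained on large text
# corpora (e.g. GloVe or word2vec), where such associations emerge from data.
embedding = {
    "emily":  (0.9, 0.1),   # illustrative European American name
    "jamal":  (0.1, 0.9),   # illustrative African American name
    "gift":   (0.8, 0.2),   # pleasant word (from the article's examples)
    "happy":  (0.7, 0.3),   # pleasant word
    "agony":  (0.2, 0.8),   # unpleasant word (illustrative)
    "prison": (0.1, 0.9),   # unpleasant word (illustrative)
}

def association(name, pleasant, unpleasant):
    """Mean similarity to pleasant words minus mean similarity to unpleasant ones."""
    pos = sum(cosine(embedding[name], embedding[w]) for w in pleasant) / len(pleasant)
    neg = sum(cosine(embedding[name], embedding[w]) for w in unpleasant) / len(unpleasant)
    return pos - neg

pleasant, unpleasant = ["gift", "happy"], ["agony", "prison"]
for name in ("emily", "jamal"):
    print(name, round(association(name, pleasant, unpleasant), 3))
# With these toy vectors "emily" scores higher than "jamal", mirroring the
# direction of the bias the researchers measured in real embeddings.
```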

These biases can have a profound impact on human behaviour. One previous study showed that an identical CV is 50% more likely to result in an interview invitation if the candidate’s name is European American than if it is African American. The latest results suggest that algorithms, unless explicitly programmed to address this, will be riddled with the same social prejudices.

Hannah Devlin, "AI programs exhibit racial and gender biases, research reveals," The Guardian, 2017