Lately I’ve been thinking about camera equipment, specifically professional, studio-grade equipment. There have been really cool innovations around computational photography in smartphones lately, for example Night Sight. Google is taking a more algorithmic / machine-learning approach and Apple seemingly a more physical / sensor-based approach to the concept. In any case, they challenge the notion of what a “photo” is and extend the environments in which photography can be used.
Also, I’ve been working out of a professional photo studio for the last couple of weeks (disclosure: it’s my wife’s studio 😊) and I was able to have a closer look at the technology they are using. Cameras are still mostly focused on the optical quality of lenses and the raw abilities of the sensor. This makes sense, as both are a clear differentiator from smartphones and a field where smartphones can’t catch up (without massively compromising on size and weight).
And for work I’ve been thinking and writing about IoT, the intelligent edge and dynamic physical spaces. One aspect is that connected smart devices will move towards mobility, or rather movability, to allow adapting to different scenarios and contexts.
Combining all three, I think there is good potential for (semi-)professional photo gear to utilize connected and movable devices, machine learning and augmented reality.
Motorized internal flashes
Watching professional photographers using external flashes is interesting. They are regularly adapting the direction of the flash based on the lighting conditions, the environment and the situation. Even without looking, they reach up, rotate the head and continue taking photos.
Contrast this to the internal flashes of consumer or semi-professional cameras. They are usually fixed, and everybody agrees they create a harsh, direct light that is not very flattering or desirable. It’s not that using the flash is bad per se; it’s about the direction and intensity. Imagine a smarter, motorized version of built-in flashes that emulates how professional photographers use theirs. The camera could analyze what’s in the shot, the current lighting conditions and the environment (where light would bounce off), and then rotate and set the flash according to the identified mood or an explicitly chosen program: “Portrait”, “Group”, “Action”, “Night” and so on.
The Canon 470EX-AI does something like this already. Its motorized head “automatically adjusts itself to ensure the correct angle for natural flattering lighting”. Then again, this is a 400€ accessory mostly aimed at on-location documentation photographers who need to get the perfect shot with very little prep time (think weddings or events).
Prediction: We will see something similar built into small point-and-shoot cameras as well as semi-professional range cameras soon.
Motorized studio flashes
Looking at actual studio work, we can take this principle one step further, though. A flash in a studio is more like a constantly moving piece of furniture that is arranged as needed.
With product photography, the light and scene are carefully set up, which usually takes a lot of time. The product has to be lit perfectly, shadows carefully placed, reflections avoided or otherwise controlled. Even small products might demand large setups with multiple light sources. Furniture or “living room” shots are usually entirely artificial as well: “the room” is often just one or two walls in the frame, propped up on a large warehouse-style floor.
As I watched the prep work for these shoots, I noticed that the camera is placed first. Afterwards, it’s a lot of checking the camera, walking to a light, minutely moving it, checking the result, repeat. Even as a team (one person checking the image, one moving the lights), there is still a lot of walking around, as one change here might affect another light there.
Some of this is already being automated. For example, Profoto has a remote control application for changing the intensity and color temperature of their studio flashes.
So how about a tripod mount that accepts voice commands and then moves the light accordingly? That doesn’t sound too complicated: it takes about 15 minutes to scribble up a concept, and maybe a day to build a basic proof of concept from off-the-shelf parts.
A connector for standard 1/4" tripod studs on one end; a battery-powered microcontroller running some form of voice recognition software, ideally with intent recognition for conversational commands; two servo motors (pan, tilt) operated via that software; a microphone for listening; maybe a camera for additional shenanigans; a couple of status LEDs; and finally a standard 1/4" stud on the other end to attach the studio flash. Put everything into a super sturdy shell and that’s it, right? Pretty much all the components are technologically mature, and it shouldn’t be too hard to pull together a product like this.
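The software side between intent recognition and the servos really is the simple part. Here’s a minimal sketch in Python, assuming a hypothetical command grammar (“pan left 10 degrees”) and a standard 0–180° servo range; a real build would forward the resulting angles to the PWM channels driving the two servos:

```python
import re

# Hypothetical command grammar: "<pan|tilt> <left|right|up|down> <N> degrees"
_CMD = re.compile(r"(pan|tilt)\s+(left|right|up|down)\s+(\d+)", re.IGNORECASE)

# Signed direction per axis.
_SIGN = {"left": -1, "right": +1, "down": -1, "up": +1}


def parse_command(text):
    """Turn a recognized utterance into (axis, signed_delta), or None."""
    m = _CMD.search(text)
    if not m:
        return None
    axis, direction, amount = m.group(1).lower(), m.group(2).lower(), int(m.group(3))
    return axis, _SIGN[direction] * amount


class FlashMount:
    """Tracks the pan/tilt servo angles; a real build would write these
    out to the servos' PWM channels after every change."""

    def __init__(self):
        self.angles = {"pan": 90, "tilt": 90}  # start centered

    def apply(self, text):
        parsed = parse_command(text)
        if parsed is None:
            return self.angles  # unrecognized command: hold position
        axis, delta = parsed
        # Clamp to the servo's physical 0-180 degree range.
        self.angles[axis] = max(0, min(180, self.angles[axis] + delta))
        return self.angles


mount = FlashMount()
mount.apply("pan left 10 degrees")   # → {"pan": 80, "tilt": 90}
```

Everything above is off-the-shelf logic; the actual engineering effort would go into the voice recognition quality and the mechanical sturdiness of the mount.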
Beyond third-party solutions, motorized functions will also be built into flashes and tripods directly. Tripods especially would add the benefit of controlling the height as well as the pan and tilt. That said, since flashes and tripods are long-term investments for studios, third-party retrofit solutions make sense here.
Prediction: We will see third-party solutions like this sooner rather than later. It will take a moment, but the capabilities will also be built into studio tripods and flashes directly.
And then I remembered this:
Yes, add a camera, point to a position and the entire tripod will drive there! Entire setups can be saved and the lights will drive to their specific positions! OK, that’s maybe taking the scenario too far. Let’s look at more reasonable things, specifically adding augmented reality to the mix.
Full studio augmentation
Being able to precisely set up studio flashes is tricky, since you don’t see the light except when you take a picture. To get around this, most studio flashes use modeling lamps: light bulbs that are low-powered (compared to the actual flash) and can be turned on while setting up the scene. However, this is still imprecise, especially with multiple light sources that might interfere with each other, or with certain types of product shoots. Setting those up takes a lot of time and experience.
Imagine using augmented reality to visualize exactly where the flash is pointing by adding a directional “laser sight”. It could also show the primary path of light as well as potential reflections, based on the identified geometry. Machine learning could help identify the usual set of softboxes, reflectors and backgrounds, providing the basis for the required calculations.
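The “laser sight” overlay itself is just vector math: once the scene geometry is identified, the primary bounce off a flat surface follows the standard specular reflection formula. A minimal sketch, assuming a unit-length surface normal:

```python
def reflect(incident, normal):
    """Specular reflection of a light direction off a flat surface:
    r = d - 2 (d . n) n, with d the incident direction and n the
    unit-length surface normal."""
    dot = sum(d * n for d, n in zip(incident, normal))
    return tuple(d - 2 * dot * n for d, n in zip(incident, normal))


# A flash pointing forward and down at the floor (normal straight up)
# bounces forward and up:
bounce = reflect((1.0, -1.0, 0.0), (0.0, 1.0, 0.0))  # → (1.0, 1.0, 0.0)
```

The hard part is not this formula but reliably recovering the geometry and surface properties of softboxes, reflectors and walls; that’s where the machine learning mentioned above would come in.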
The application could even simulate the resulting picture based on the current settings of the flash (color, temperature, intensity), showing “heat maps” where the light currently concentrates and maybe leads to overexposure.
Prediction: Professional studio equipment manufacturers will not only add physical capabilities, but also provide apps that go well beyond just remote controlling their equipment, into augmented reality.
Now combine this with the concept above: tap on a specific light to re-orient it or change its temperature and power settings, seeing the simulated results in real time.
And then subtract the phone and build the capabilities into the camera directly, so all of this appears in the camera’s (digital) viewfinder or a connected viewfinder app / PC.
Prediction: Manufacturers of professional cameras will follow with these software features as well, but it will take a while to negotiate communication protocols and commands with studio equipment manufacturers.
All of this is not strictly necessary. These scenarios don’t create entirely new kinds of photography or entirely new workflows; they support and streamline existing processes. But hey, sometimes using new technologies for good old continuing product innovation is perfectly fine.
What do you think? Any other interesting scenarios that you anticipate in the professional studio photography space? Feel free to comment!