

This is a great idea (allowing for high-bandwidth cabled input of an iPhone source), but the current implementation has three large/annoying drawbacks:

First, the iPhone is set to full auto mode, which means auto white balance, with touch-to-expose as your only exposure control. Your colours will shift back and forth as you broadcast, and there's no way to correct this.
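
For what it's worth, AVFoundation already exposes manual controls, so this looks fixable on the app side. Here's a minimal Swift sketch of what a hypothetical lock button could do; the function name is mine, and the shutter/ISO values are illustrative placeholders, not anything the app actually uses:

```swift
import AVFoundation

// Hypothetical "lock" action: freeze white balance and exposure so
// colours stop drifting mid-broadcast.
func lockColourAndExposure(on device: AVCaptureDevice) throws {
    try device.lockForConfiguration()
    defer { device.unlockForConfiguration() }

    // Freeze white balance at whatever gains auto WB has settled on.
    if device.isWhiteBalanceModeSupported(.locked) {
        device.whiteBalanceMode = .locked
    }

    // Pin exposure to a fixed shutter/ISO pair, with ISO clamped to what
    // the active format supports. 1/60 s and ISO 100 are placeholders.
    if device.isExposureModeSupported(.custom) {
        let iso = min(max(100, device.activeFormat.minISO), device.activeFormat.maxISO)
        device.setExposureModeCustom(
            duration: CMTime(value: 1, timescale: 60),
            iso: iso,
            completionHandler: nil
        )
    }
}
```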

Second, there's no orientation lock option in the app, which means you need to lock your entire phone in portrait mode and then manually offset the input in OBS by 90 degrees to make it display correctly. If you forget to do this, you'll have some ugly moments while broadcasting.
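
This also looks fixable in the app rather than in OBS, since AVFoundation lets you pin a video connection's orientation regardless of how the phone is physically held. A hedged sketch (the function name and the idea of wiring it to a settings toggle are my assumptions):

```swift
import AVFoundation

// Hypothetical orientation lock: pin the video connection to landscape
// so OBS never needs a manual 90-degree transform on the input.
func lockLandscapeOrientation(on output: AVCaptureOutput) {
    guard let connection = output.connection(with: .video),
          connection.isVideoOrientationSupported else { return }
    connection.videoOrientation = .landscapeRight
}
```

Until something like that ships, the workaround is as described above: lock the phone itself to portrait, then rotate the source in OBS (right-click the source, then Transform).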

Third, the visual feedback on the display is synced to the timing of the stream output, not the output from the camera, which means it drifts slowly out of sync as you stream. When you reframe, for example, it takes the phone screen a second to register the movement and display it. And if you're following a moving subject, you have to "guess" where the subject is in relation to the camera, since your visual feedback is delayed.
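
Presumably the app renders its preview from the encoded/streamed frames, which is where the delay and the drift come from. The standard fix on iOS is to drive the preview straight from the capture session with AVCaptureVideoPreviewLayer, so the screen shows sensor frames directly while the encoder reads its own output in parallel. A sketch, with names of my own invention:

```swift
import AVFoundation
import UIKit

// A view backed by AVCaptureVideoPreviewLayer: frames travel from the
// sensor to the screen directly, independent of the encode/stream
// pipeline, so the preview cannot lag or drift behind the camera.
final class CameraPreviewView: UIView {
    override class var layerClass: AnyClass { AVCaptureVideoPreviewLayer.self }

    var previewLayer: AVCaptureVideoPreviewLayer {
        layer as! AVCaptureVideoPreviewLayer
    }

    // Attach the live capture session; the streaming encoder keeps
    // reading frames from its own output on the same session.
    func attach(session: AVCaptureSession) {
        previewLayer.session = session
        previewLayer.videoGravity = .resizeAspectFill
    }
}
```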

If these three things could be fixed, it would be amazing.