I’m hoping to do some formal screencasting — educational videos for iOS and Rails — in the future, so for practice I hacked together this little Introduction to Objective-C Categories to try some stuff out. It’s not the worst thing ever, so I thought I’d share it along with my notes.
A short introduction to Objective-C categories for iOS and Mac developers.
As far as what I was trying to learn through the process of making this…
What is a good resolution to shoot at?
I ended up trying 1920x1080, which in TV speak is 1080p. This I think worked well. There is enough screen real estate to show Xcode with all “widgets” open, plus enough room for a side app like the iPhone Simulator. Speaking of which, 1080p barely squeezes in both the portrait retina iPhone and the portrait non-retina iPad. Finally, should this ever be piped out to a TV it should display full screen with no scaling.
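As a back-of-the-envelope check on that “barely squeezes” claim, here is a quick sketch, assuming the simulator windows render at 1:1 pixel size (retina iPhone portrait at 640x960 px, non-retina iPad portrait at 768x1024 px):

```python
# Do both portrait simulators fit side by side on a 1080p canvas?
SCREEN_W, SCREEN_H = 1920, 1080
IPHONE_RETINA = (640, 960)      # assumed 1:1 pixel size, portrait
IPAD_NON_RETINA = (768, 1024)   # assumed 1:1 pixel size, portrait

side_by_side_width = IPHONE_RETINA[0] + IPAD_NON_RETINA[0]  # 1408
tallest_height = max(IPHONE_RETINA[1], IPAD_NON_RETINA[1])  # 1024

print(side_by_side_width <= SCREEN_W)  # True: 512 px left over horizontally
print(SCREEN_H - tallest_height)       # 56: only 56 px of vertical headroom
```

So both fit, but with just 56 pixels of vertical headroom for the menu bar and window chrome — hence “barely.”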
What I don’t like is how tiny the file browser and non-source text can be. I envision zooming in on occasion to have those read well when needed.
How is Vimeo these days?
I’ve put up a few videos in the past on Vimeo but for this new project I’m considering using their Pro service so I saw this as an opportunity to play around with their stuff.
Overall things seem good. They are really good about suggesting codec and bit-rate changes to get the most from their platform. They also provide a nice HTML5 version of their player.
Why not YouTube? Long term I could see some of this content becoming pay-for or subscriber/membership-based and YouTube isn’t really good for that.
To record audio before video or with video?
For the majority of my previous screencast work my typical process included recording the audio on its own and then recording the video, matching everything up in editing. The result is a nice, tight video without any real hesitations or pauses.
For this video I did things more casually. I had a list of things I wanted to demonstrate and recorded my voice right along with the video. There are pros and cons to this.
- I do enjoy the personality that comes from this style. To hear the typing and a few ums makes a human connection.
- If done well, it can shorten overall capture time.
- It lends itself to camera shots of the speaker, which again can help create a human connection.
- Doing the video live with the audio is much, much harder to perform. It’s easy to miss things you intended to show off (I did in this video).
- I myself have bad allergies and tend to breathe into the mic. If recording the audio on its own, it’s easier to isolate this.
- Some people will not like hearing the typing.
- If the screencast is based on a fixed script, I’ll be able to post a text version more easily, which is extremely valuable (for Google-food as well as for people who prefer text over videos).
In the end I think I’ll be going back to recording the audio first, then the screen, but I welcome your feedback.
Other random observations:
- Probably want to hide the dock for more “Xcode space”.
- Those early “title slides” were done in Keynote. Works great for this kind of stuff, especially animations to explain abstract concepts.
- In the future I won’t be typing everything. Longer code will be uncommented in place or dragged in from snippets.
- Could have done some zooming to help visualize things like Xcode’s new-file sheet and the scheme editor.
- Need to overlay URLs in large type when promoting a website.
Not sure how fast we’ll see real production start on these videos as I do have a few things already cooking but I don’t mind too much as it’s good to be busy. 🙂