Handwriting on an iPad – My Recommendations for Apps and Styli

 

Two years ago, I wrote this post about my search for the best handwriting tools for the iPad.  I reviewed both software (notebook apps in which to write) and styli (pens with which to write).

In summary, I recommended Noteshelf as my app of choice, and the Adonit Jot Flip Stylus as the best stylus.

Twenty-four months later, have I changed my mind?

Yes and no.

First, the Apps.

Most of the apps I reviewed have been updated and improved since I wrote that post.  I still think Noteshelf is the best of them.  Its core remains much the same, but it has gained some significant improvements.  The most notable is that there is now an option to create notebooks that automatically sync with Evernote (as Penultimate did two years ago).  Unlike Penultimate, though, you can turn this feature on or off on a per-notebook basis.  I have one notebook that automatically syncs to Evernote. I use that one for quick notes, detailing phone conversations, etc.  It works great.  All other notebooks I prefer to sync manually when I am ready, rather than continuously in the background.

The other feature I previously preferred in Penultimate was the way it handles cut and paste using a lasso-drag-tap gesture.  It’s incredibly cool and useful. Noteshelf now has that feature too.

I still think Noteshelf is the most responsive and accurate notebook app, with the best set of features. Not the cheapest (Penultimate and some others are free) but the best – even more clearly so than it was two years ago.

Second, the Styli.

I still love the Adonit Jot Flip – and it’s still the stylus I turn to when I want to scribble notes on my iPad.  It’s just fantastic. It does have two significant shortcomings though.

  1. It really is not an appropriate choice for kids – because that little clear plastic disk is just too easy to break/lose. So is the screw-on cap.
  2. It’s no good if you are using a stylus to record a screencast in an app like Explain Everything, Educreations or Collaaj, because the app records the ‘tap-tap’ sounds of the hard plastic disk striking the glass and it’s quite distracting.

I’ve tried a number of new styli since writing that original post.  Most significantly, I bought the Evernote Edition Adonit Jot Script stylus (for more than AU$119), which features a fine nib – the idea of which really excited me.  More recently I’ve tried the Edugrip App Pencil.

Adonit Jot Script

I hate it.  If you are thinking of buying one of these, my advice is to borrow one and try it out first.  Maybe it’s just my handwriting style or something, but I find it laggy, unresponsive and inaccurate.  It requires AA batteries, needs to be switched on each time I want to use it, and needs to be paired via Bluetooth LE with the iPad.  I find the experience annoying, and my handwriting is nowhere near as neat as it is with any other stylus I’ve used! That’s just too much compromise to make for a fine point. Plus there is no pocket clip, no ball-point pen in the back, and compared to the Adonit Jot Flip, it feels cheap and plasticky, and it’s more than twice as expensive! Like the Adonit Jot Flip, it makes an audible “tap” when it touches the glass, so it’s no more appropriate for use in screencasting.  I never use it for anything.  I only keep it so I can show people who are thinking of buying one why they shouldn’t.

App Pencil

This stylus is a new offering, with an unashamedly educational focus.  The App Pencil is actually quite good for its intended market.  It’s basic, robust and inexpensive ($15).

It features a triangular transverse section (like those big grey-lead pencils kids use when they are learning to write) and the material is a sort of dense rubber – so it’s super-comfortable to hold.  It is all in one piece; there is no cap to lose, no plastic disk to break off.  “Unbreakable” and “child” are two words that can’t honestly be used together; nothing can withstand a determined kid, but I think this is about as resilient as a stylus could be expected to be!

What I really like about the App Pencil, though, is that writing with it feels surprisingly good.  Most styli I’ve tried that have a rubber tip like this are hard to write with, because the rubber drags over the glass and feels blunt and numb.  I liken it to writing with an eraser. But the App Pencil feels better than most; it slides relatively smoothly over the glass.

As with all styli with blunt, rounded tips, it’s hard to form small characters (because you can’t see the exact point on the screen where the line is being formed), but for typical note-taking, diagram labelling, etc., it’s a pretty good experience.

The App Pencil has a rubber loop at the back end that could be used to tether it to the iPad with a string (assuming the iPad has a case that provides something to tie it to).

I wish the App Pencil were triangular along its entire length so that it wouldn’t roll, but the ends are circular in transverse section, which means it easily rolls across the desk.  (Not that it will break, though; it just bounces when it hits the floor.)

Edugrip claims that the App Pencil works with all Apple and Android tablets except the iPad Air.  That said, I have an iPad Air and it seems to work just fine for me.

It will be the stylus I use when screencasting with Explain Everything.  I’d also recommend it to any teacher planning to booklist a stylus for students to use at school.

What can teachers bring to the classroom that has increasing value?

For the past 12 months I’ve been putting this question to teachers and school leaders in various forums:

What do teachers bring to the classroom that is still scarce now that we have Google, YouTube and Wikipedia?

As expected, I’ve received numerous answers to that question, and with a nod to ‘Family Feud’, the top ten answers are on the board:

top ten responses

You’ll notice that “Providing Knowledge” is not on the list.  Twenty years ago, knowledge was one of the most valuable things a teacher contributed to the learning experience of students.  Now it doesn’t even make the top ten.

I think an equally valid question to ask is this: What can teachers bring to the classroom that not only still has value, but which has increasing value?

What can teachers bring to the classroom that has increasing value?

I’d be interested in your answers to that question. I have a few of my own, which I’ll develop further in future posts:

  • Critical thinking.
  • Mindfulness.
  • Wisdom.
  • Honest and constructive feedback.

Here’s the point: You can cut the ‘Class-time Pie’ any way you want.  But if the largest slice is being given to standing at the front of the room disseminating a commodity of falling value, then less time can be devoted to really building a precious classroom experience for students.

classtime pie

Install Fonts on Your iPad

Have you ever crafted a Keynote slide or Pages document on your computer, thoughtfully selecting the perfect font, only to find that when you open the document on your iPad you are greeted with the message:

“The font FortuneCookie is missing.  Your text might look different.”

This morning I made a slide in Keynote on my Mac, and chose the font FortuneCookie.  My iPad replaced FortuneCookie with Helvetica Neue – a nice enough font but not the one I had chosen!

Or perhaps you are just bored by the small selection of fonts on the iPad and want to add a few.

Unknown to many people, it’s actually quite easy to install additional fonts on your iPad.  Start by downloading AnyFont ($2.49).


Fig 1. iTunes on my Mac, showing the File Sharing section within the Apps tab for the iPad.

Then connect your iPad to your computer, open iTunes, select your iPad in the devices list, click the Apps tab at the top, and scroll down to the File Sharing section.  Within the File Sharing section, you will see a list of all the apps that are available for file sharing.  Select AnyFont from that list, as shown in Fig 1.

Drag any TrueType font (.ttf), OpenType font (.otf) or TrueType collection (.ttc) from your computer into the pane titled “AnyFont Documents”.  If you have just installed AnyFont, this pane will be empty.  You can see that I have added 14 fonts.


Fig 2. Close-up view of the AnyFont app on the iPad, with FortuneCookie.ttf selected. Tap the large icon to begin installing.

Now you can close iTunes on your computer and disconnect your iPad.  Open the AnyFont app on your iPad, and tap the font(s) you have just added.  The app will take you through a few steps to install the font.  (These steps feel unusual as you are doing them, but they are quite easy and safe).

That done, open an app such as Pages and format some text; you will see that your new font is now available to use.
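(A side note for the developer-inclined: if you want to confirm programmatically which fonts an app can actually see, the little Swift sketch below simply lists UIKit’s font registry. It is only an illustrative sketch – it assumes, as appears to be the case, that fonts installed via AnyFont’s configuration profile are registered system-wide, and note that a font’s family name may differ from its file name.)

    import UIKit

    // Minimal sketch: list every font family (and face) the system currently exposes.
    // A font installed via AnyFont should appear here alongside the built-in families.
    // Note: the family name may differ from the .ttf file name.
    func listAvailableFonts() {
        for family in UIFont.familyNames.sorted() {
            print("Family: \(family)")
            for face in UIFont.fontNames(forFamilyName: family) {
                print("  \(face)")
            }
        }
    }

    // Quick check for a specific family, e.g. "FortuneCookie".
    func isFontInstalled(_ familyName: String) -> Bool {
        return UIFont.familyNames.contains(familyName)
    }

If the new font doesn’t show up in that list, the profile installation probably didn’t complete.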


Fig 3. My Keynote slide, looking nice with the font FortuneCookie.

That’s all there is to it.  Now you can make your presentations and documents look as nice on your iPad as you can on your Mac.

Remote-Control Keynote on iPad With Your iPhone

There are lots of options for giving a Keynote presentation from your iPad, most of which are sub-optimal.  This morning I discovered that I can do it over Bluetooth (not WiFi) from my iPhone. I didn’t know that was possible! It makes presenting from an iPad a realistic option.

Option 1 – AirPlay

You can AirPlay your iPad screen to an AppleTV, or to a computer running Reflector, AirServer or X-Mirage.  I’ve done that sometimes in the past, but there are three problems.  ① Sometimes AirPlay can be a little laggy – especially if you have video in the Keynote slides.  ② The iPad is a little bulky to hold while presenting. Its heft restricts hand gestures and, I think, looks a bit awkward. ③ I find that sometimes when there are lots of WiFi devices in a room all accessing the WAP (as is usually the case in a classroom) the AirPlay connection tends to drop out altogether.  That happens too frequently to be viable.

Option 2 – Cable

You can hard-wire your iPad to the projector using a VGA or HDMI cable and the appropriate Lightning adaptor. But if you present from your iPad while hand-held and tethered to a cable, you’re likely to accidentally unplug it mid-stream when you trip over the cord. And if you are going to leave the iPad on the table, your own movement in the classroom is restricted (or you’ll be constantly dancing back and forth to change slides – which I think looks comical and, frankly, a bit amateurish).

Option 3 – My new preferred option

A third alternative is to connect your iPad to a cable, then use Keynote on your iPhone to control Keynote on your iPad. This has been possible for a while, but what has stopped me from doing it is that (I thought) it relies on having both iPhone and iPad connected to the same WiFi network.  I don’t trust that arrangement because I’ve sometimes found the whole “both-devices-on-the-same-network” thing to be a bit unreliable, with dropouts being too frequent.  Also, in a number of schools in which I’ve worked, even though my devices are connected to the same WiFi network, they still can’t see each other (I’ve no idea why – but presumably there is something on the network preventing this kind of interoperability).

But I recently discovered that if you turn the WiFi off on your iPhone, you can use the Keynote app on your iPhone to control Keynote on the iPad via Bluetooth (directly between the two devices). In my testing that provides a more robust link, making presenting from the iPad via the iPhone viable.  While you do have to turn WiFi off on your iPhone, you can leave the iPad’s WiFi on, which seems to work just fine. That is handy if you need to access the web or another app during your presentation.

The iPhone app allows you to advance to the next slide, go back to the previous slide, navigate to any slide by number, and the iPhone’s screen displays the current slide together with either the next slide or your presenter notes.

It’s a pretty good set-up.  The iPad is directly connected via cable to the projector, so video isn’t laggy.  The iPhone is not sending video via AirPlay – it’s just controlling the iPad – so it works reliably, and the iPhone acts as a confidence monitor, too (so you don’t have to turn your back to the class to see what is on the slide – it’s right there in the palm of your hand!).

 

Tapes: A Ridiculously-Quick, Frictionless Screencasting Tool for Mac OS X.

A while ago I wrote a post covering all the screencasting tools I could think of, from expensive-and-complex at one end of the continuum to free-and-simple at the other. Since writing that post, I have discovered another screencasting tool that I am quite enamoured of.

Tapes is the simplest and fastest way to make a screencast I’ve ever seen. It’s quick. I mean really, really quick to use.

Click on the Tapes menu bar item, choose “Record New Tape” and bang! you are recording. When you choose “Stop and Upload”, it instantly tells you that a link has already been placed on your clipboard. You can immediately paste that into an email or discussion thread, even while the video is still uploading in the background! It’s that easy and quick. Watch this little one-minute demonstration to see what I mean. It’s really quite something.

It’s not the tool I’d use to make a full-featured screencast. But for a quick explanation, it just can’t be beat.

Tapes has a one-time purchase price of $12.99, which also gives you 60 minutes of recording each month (ongoing), but if you buy it with this promo code, you’ll get an extra 15 minutes per month.

If you are looking for a free alternative, QuickCast is similar but not so amazing.  For example, when you click to record, QuickCast gives you a 5-second count-in, whereas Tapes just starts recording.  Also, with QuickCast, once you finish recording you have to wait until the video has finished uploading before a share link becomes available. Furthermore, once your video has finished uploading in QuickCast, you have to pull down the QuickCast menu and click on the video to copy a share link, whereas Tapes does all that for you.

Those shortcomings in QuickCast might seem inconsequential, but they mean you’ll find yourself wasting minutes every time you make a screencast, whereas with Tapes, as soon as you’ve finished recording, you can paste the link somewhere, forget about it, and move on to the next task. That increase in efficiency is noticeable – and since efficiency is the core reason for wanting to use either of these apps in the first place, Tapes is the better choice.

 

Finally! Record the Screen of Your iPad in Any App, with Narration.

Yesterday X-Mirage added the ability to record not only your iPad’s screen and audio via AirPlay, but also your voice narration.  I’ve been waiting for someone to implement this for ages.

Before we get to the details, here’s a little video I made to demonstrate how good the result is.

I’m a fan of iPad screencasting apps like Educreations, Collaaj and Explain Everything, but the limitation of all these apps is that they can only record within the app itself, due to Apple’s sandboxing policy. In other words, you can’t use Explain Everything to make a video tutorial about how to change settings in the Settings app, how to create an eBook in Book Creator, or how to write a formula in Numbers or Excel.  Nor can you use them in combination with a content-based app to make a screencast explaining a topic.

X-Mirage is not the first computer application to allow video mirroring from an iOS device. It’s not even the first to provide a video recording function. In fact, both AirServer and Reflector have made this possible for some time now. But X-Mirage does something these others don’t. [Edit: actually AirServer has recently added this functionality too]. It allows you, via your computer’s microphone, to simultaneously record your voice. Now you can simply work on your iPad and describe what you are doing, and X-Mirage captures it all! When you are finished, the video and two audio tracks are mixed down into an MP4 video and saved to your computer. [It seems that this is only possible on a Mac at this stage. The PC versions of both X-Mirage and AirServer will record iPad video and audio, but not your voice – sorry, PC users.]


X-Mirage is remarkably simple to use. With your iPad and computer connected to the same WiFi network, launch X-Mirage on your computer, then swipe up from the bottom bezel of your iPad to bring up the Control Centre. Tap the AirPlay button at the bottom of the Control Centre, choose the X-Mirage option and toggle mirroring to ‘on’. You will then see your iPad screen mirrored to your computer.

To start recording, click the (quite obvious) red ‘Record’ button at the right edge of the window. To record your microphone as well, you also need to click the smaller microphone button immediately to its left. That’s in fact the only part of this process that is anything less than child’s play – you have to click both those buttons when you start recording: first start the video recording, and then, once that is going, click the microphone button to start recording your voice.

x-miragewindow

Because X-Mirage uses your Mac’s microphone (not the microphone in your iPad), you do need to stay within range of your computer (i.e. you can’t be walking around the room with your iPad while recording). The upside, however, is that if you have a good external microphone attached to your Mac, your screencast’s audio will benefit from better sound quality than if X-Mirage recorded from the iPad’s microphone directly.

X-Mirage is $16 with discounted educational pricing available from the website.

Even at the full price, it’s well worth the money (in my opinion).


 

PS. I do know that it’s possible to use Reflector (or AirServer) to mirror an iPad screen to a computer, then simultaneously use some other screencasting software on the computer to record what Reflector is displaying – I’ve done that myself a number of times.  It’s a lot of mucking around, though, and you have to really want to make a screencast for it to be worth the effort!

 

Our Teachers Didn’t Have a Choice; We Do.

Many of the structures, processes and workflows that characterise schools were designed by teachers at a time when there were no choices.  Our teachers were constrained by technological limitations that no longer constrain us (except for the limits we put on our own thinking).

Our teachers couldn’t choose where or when to interact with their students.  They had access to their students for a limited amount of class time each week. We do have a choice. We can engage with our students in class, but we can also interact with them via any one of numerous synchronous or asynchronous online platforms.

Our teachers couldn’t choose how their students would publish their learning / ideas / stories / art / research. In fact, they couldn’t choose to publish at all. Our students, on the other hand, have a dizzying array of available, socially relevant publishing options, the possibilities of which ought to have us spinning in our chairs with excitement.

Our teachers didn’t have a choice about covering the whole curriculum in class – how else would their students be exposed to all the nuggets of knowledge they needed?  Our students have the Library of Alexandria at their fingertips! That gives us a choice that our teachers didn’t have, about which parts of the course to spend time on in class and which parts to let our students take responsibility for covering themselves – or with the help of a smorgasbord of online teachers / animations / forums / courses / tutorials / screencasts / podcasts.

Of course, just because we have a choice doesn’t mean we should necessarily exercise it in any given situation. Conceivably, the way our teachers did things might sometimes still be the best way.  But one would imagine that with all the choices now available, the old ways are unlikely to be the most effective in a majority of situations.  (Though they are still practised in a majority of situations.)

A gasp-worthy content-aware healing brush app for iPad

Every device has its particular strengths and weaknesses, and while it’s true that there are certain tasks you can do on a computer that are not as easy to do on an iPad, the list of tasks that can’t be accomplished on an iPad is ever-shrinking.

Photo editing is a case in point.  It’s been possible to do basic photo editing on the iPad for years now, but anything beyond basic has always sent me back to my computer.  That’s starting to change, though, as new apps such as Handy Photo ($2.49) emerge that let me do things for which I used to need Adobe Photoshop ($9.99 per month).

Content-aware healing is a great example.  In Adobe Photoshop, the content-aware healing brush lets you select an object in a photo that you want to remove, and Photoshop will remove it and fill in the background intelligently – guessing at, and reconstructing, what might have been behind the now-deleted object, based on its context.  (It’s quite magical to watch, really.)

Amazingly, you can now do that on the iPad using a remarkable app called “Handy Photo”.  It’s very simple to use, and besides letting you delete objects, it lets you move them and perform a range of other Photoshop-like tricks that I haven’t seen in any other iPad apps. The gap between computer and iPad is closing rapidly.

I’ve been using Handy Photo for a while now (for personal use), but demonstrated it for the first time at a workshop the other day, to the amazed gasps of the audience.  I thought if it impressed them so much, it could be worth sharing here, too.

OCR With Your iPad

Grab it while it’s hot! At the time of writing, this app is free!

Let’s say you find a newspaper article that is stunningly relevant to what you are teaching at the moment, but the language in the article is pitched at the wrong reading level.  Maybe you teach primary students and The Age journalist has used too much scientific jargon in an article that is also too long. How cool would it be to substitute a few words and delete a few paragraphs? Or perhaps you teach VCE students and the Herald Sun editor has dumbed down the terminology to make it more … accessible, but you’d prefer to edit some of that meaningful terminology back into the article to strengthen its ties to what your students are learning.

Scanner with OCR will allow you to take a photo of the article with your iPad’s camera; it will ‘read’ the text in the photo and give you a plain text document that contains the entire article! You can edit the text, delete sections, or copy and paste bits out into another document. It also means you can save the article to (say) Dropbox, and because it is text you will be able to search its contents later.

There are other iOS apps that will OCR a scan. The most obvious one is Smile Software’s PDFpen Scan+, which has been around for a long time now (perhaps six months or more).  That app is expensive though – at AU$9.00. It’s also nowhere near as good (although the interface is niftier and it will let you scan multiple pages). It’s OK for a standard typed page, such as a letter, but it has real trouble with columns of text, and in my experience it makes quite a few errors even with a standard page of text. It also seems to struggle with things such as bullet points.  In one document I scanned this afternoon, it interpreted one bullet point as a “$” and another as a “0”. Also, once the OCR is completed, you have to export the entire document to (say) Pages and then edit the text there; you can’t just select and copy a paragraph like you can in Scanner with OCR.

Scanner with OCR, on the other hand, was practically flawless in my testing. It rarely makes a mistake, even with bullet points – and it copes with multiple columns deftly.

Normally just AU$1.99 – today it’s totally free.

What To Do With The Extra Class Time? – Teach Like a Pirate!

If you flip your classroom – what will you do with the extra class time you free up?

If a teacher’s answer to that is that kids will do the work that would previously have been assigned to them as ‘homework’, then I feel a bit sad (for their students). There are so many more valuable things we could be doing with that precious ‘together time’ – things that can’t be done at any other time or in any other place.

Book recommendation: Teach Like a Pirate, by Dave Burgess. It’s not an EdTech book. It really has nothing to do with technology – but in a way, that’s why I’m recommending it. The promise of the flipped classroom model is that class time will be freed up for … whatever you want to use that time for! In my view, that time is best spent doing things that students can’t do anywhere else. Those things are scarce, and if scarce, then valuable. Many of Burgess’ classroom ideas and strategies are time-consuming (which will be the major blocker in a traditional classroom). But that’s my point really: if you are considering flipping your class and wondering what you might do with all that extra class time, I think Teach Like a Pirate will expand your thinking, or at least give you some inspiration for time-consuming but valuable learning experiences you could be giving your students.

I especially love the thought-provoking questions Burgess has included in the book. Here are three that particularly struck me:

  1. If your students didn’t have to be there, would you be teaching in an empty room?
  2. Do students wait until their next period to go to the bathroom because they are afraid they will miss something unforgettable in your room?
  3. Do you have any lessons you could sell tickets for?

In a world with Google, Wikipedia and YouTube – a world where information has lost its scarcity – we teachers need to keep asking ourselves, “What do I bring to the classroom that is scarce?”, because if I am really doing something valuable in my classroom, my answers to Dave’s three questions above will be “No”, “Yes” and “Yes”.