Note: As I’ve recently been dealing with my flight from Twitter, I’ve had some more time to think about my experiences, some of my motivations and changing expectations, and just where it is I feel I’m heading. While originally meant as a series of Mastodon posts, this seemed to quickly outgrow that mode of expression. The topic remains, and may always remain, a work in progress for me…
I’ve been full time on Mastodon (here) since the deal closed, but had preempted that by registering soon after the deal was announced so I could start to settle in, and I fired up my own instance not long after that.
I couldn’t abide staying on Twitter (there) long term, even if the deal fell through, simply because of who the shareholders and board were choosing to approve the sale to.
Like many, I’d built a comfortable set of follows and followers on Twitter, and knew it would take a while to settle in here. I honestly resented the board for recommending the deal, although from a fiduciary standpoint I understood why they would pursue it given the tanking tech stock market.
While there are many I follow here (or am followed by here) who were on Twitter, it’s certainly not a complete overlap. But it’s certainly at least as interesting a mix!
I’m not sure if tools like Movetodon and Fedifinder will capture many more of the Twitter accounts I followed in their new home here if Twitter continues its URL ban. I may need to rely on others finding me while my Mastodon-referring Twitter account remains active, or just through mutual follows or serendipity.
As per my earlier Mastodon post, I struggle to rationalise why some of the people I followed on Twitter have, bafflingly, not yet come over (or at least left Twitter).
I know for many disadvantaged folks the community there can be a literal lifeline (and I hope that can become the case here), but for those not facing those challenges, ignoring (or abetting) the cesspit that it’s becoming just seems like an especially wilful sort of ignorance to me.
And there’s certainly no point staying to try and fight the good fight – that fight is so heavily weighted in the opposition’s favour it’s no fight at all.
The “Mastodon’s too hard” argument doesn’t cut it for me – Twitter was hard once (I can’t tell you how difficult it was for me to learn direct messages in the early days, or how to effectively use the “.” tagging method!), Facebook has a learning curve…in fact MySpace, Insta, Yahoo Groups – they all had learning curves!
For some, I think it’s a handy and especially wilful sort of laziness to just stay where they are. As in physical reality, inertia can be pretty powerful.
I’d originally intended on leaving my accounts there as zombie accounts to prevent my handles being overtaken, but that is seeming less and less useful as time goes by. If I’m never going back (and I am never going back), what do I care if my handles are snaffled? They weren’t even my first choices!
But I haven’t decided between just deactivating, or deleting my tweets and then deactivating (I’d be interested in the pros and cons). And I do want some more time for my now-inactive accounts to help me grab a few more of my contacts from there.
As it started to be the case on Twitter, I find I’m struggling to keep up with my timeline, but I think a not insignificant part of that is discussion about Twitter, so I’m going to filter relevant terms and obfuscations to improve the signal-to-Musk ratio. I don’t quite feel ready to unfollow anyone at the moment, so I’m hoping that brings things back under control. At least I’m not suffering the sort of low level anxiety I did on Twitter at not keeping up.
At least I can check in on my instance’s Federated Timeline every now and then in case I feel I’d like an update on Twitter goings on – I do still care about what’s happening over there, but I need some clearly delineated space, too.
I very quickly settled on a strict rule that I don’t cross-post from my main account – Twitter gets nothing from me now.
And while I intended to extend this to phasing out posting to the Applesauce Fluxes Twitter account, I cut that off early after one of the many egregious decisions Musk made (I think it was unbanning Trump, but it’s honestly all becoming a blur now).
I’ve seen so many interesting introduction posts here, not all overlapping my interests, but I’ve made an effort to boost as many as I remember to in case they overlap my followers’ interests. Now is the time for community building, and this seems a low effort contribution I can make towards that goal.
I still have to get in the habit of using more #HashTags (don’t forget to #CamelCase!), but I think I’ve been remembering image descriptions/alt-text pretty reliably – CamelCasing hashtags and adding image descriptions are low cost (especially for what my time is worth!) ways of supporting accessibility here which I endorse wholeheartedly.
I know hashtags improve discoverability, but I’m not on a “get followed” drive, which is why I may have a lower impetus to actually utilise them more (for now).
I think that’s pretty well it at this stage. I’m enjoying Mastodon, recommend it wholeheartedly, and am still considering other fediverse usages. But for now, I really want to bed Mastodon down and feel as comfortable as I can.
P.S. Oh, and enough with the “John Mastodon” stuff already – I personally think owning that RWNJ would have been better by saying he either misread #JoanMastodon as #JohnMastodon, or, in typical RWNJ fashion, downplayed any role Joan Mastodon, John’s partner/mother/predecessor/whatever, had in establishing Mastodon the social network. And now I want to subvert the subversion, but it’s probably too late…
I’ve been giving lots of time recently to thinking about the preservation of retrocomputer-related print media, such as books, manuals, etc.
These thoughts have primarily revolved around “destructive” vs. “non-destructive” digitisation of these items, and how those digitisation methods fit into the broader sphere of “preservation” in retrocomputing.
Bound items such as books introduce physical complexities to the digitisation effort as they are not readily scannable on a flatbed or sheet-fed scanner – one way to speed the process is to remove the spine, most often by way of a guillotine, leaving loose sheets which can be quickly scanned in a sheet-fed scanner. This method is used with saddle-stitched (aka “staple-bound”) publications (including magazines) as well as perfect-bound or case-bound books.
This, of course, irreversibly alters the physical nature of the item, and is accordingly described as “destructive” scanning – its opposite, “non-destructive” scanning (appropriately), leaves the physical item intact during the scanning process.
(As a side note, it is possible with some “mechanically-bound” and saddle-stitched items to remove the binding to allow sheet-fed scanning – the binding is then replaced, restoring the item to its former state. I consider such re-binding as a non-destructive process if the item is, for all intents and purposes, returned to its original condition. It can be difficult, however, to re-create the binding as originally applied without the right equipment for the method used.)
Several years ago I destructively digitised the three editions of Lon Poole’s original Apple II User’s Guide. Once scanned, I intended to recreate the books in InDesign, replicating fonts, layout, images, etc. – a true re-creation.
Guillotining the spines off and sheet-feeding seemed the quickest and easiest way to get undistorted scans of all the pages (to be used as page templates during replication), and I used the worst-condition copy I owned of each of the editions (I had bought multiple copies of the editions for just this purpose).
As seems to invariably happen around the retrocomputing hobby, however, real life got in the way and the scans are sitting on my computer pretty much untouched, and not much re-creating has happened.
I now deeply regret guillotining even these extra, not-the-best-condition copies and believe destructive digitisation should be avoided in all but the most extreme of circumstances. If there’s no need to remove the spine, it shouldn’t be removed.
So, what’s changed in the four and a half years since I guillotined those books? Basically, scanner technology has changed, and there are now viable alternatives which allow undistorted digitising of bound print items without spine removal.
These viable alternatives do not in my view include flatbed scanning systems such as the Zeutschel zeta scanner system. I know of a local Apple ][ enthusiast/preservationist who has had extensive experience with that system, and he reports that the software deskewing/distortion removal never lived up to the promise his then employer had been sold on by Zeutschel representatives.
Those disappointing results really don’t surprise me – although such distortion removal is “only” a mathematical problem, real life is rarely as neat as mathematics would suggest. But that sort of flatbed system isn’t the only non-destructive book scanning technology available, and I’d suggest it will never work as well as the sort of system I’m thinking of.
What has changed my mind forever on destructive digitisation is exemplified by the Scribe book scanner from the Internet Archive.
Systems such as the Scribe non-destructively scan bound books while avoiding any skewing or distortion in the captured image. They do this by sitting the books in a V-shaped bed, having clear perspex or glass sheets press gently down on the pages to flatten them, and taking photos of the pages using two cameras, each mounted perpendicularly to the page they’re capturing.
The zeta system the local enthusiast had experience with cost AU$15,000, and before I saw the pricing for the Scribe I thought it would be similar – at US$13,000, it’s currently a little over the money (at today’s exchange rates, that’s AU$17,000).
However, DIY systems based on this concept are already becoming available via makerspaces (such as Robos and Dinos here in Sydney, of which I’m a member), and hobbyist versions are already available in kit form, much as kit-form 3D printers can be purchased.
At US$1,620 (including cameras), this seems like a relatively inexpensive way to go down the non-destructive digitisation path. I do appreciate, however, “relatively inexpensive” does not automatically mean “affordable”: I know I can’t afford to buy one of these scanner kits at the moment, much as I’d like to.
I’ll be demoing the Robos and Dinos book scanner at WOzFest PR#6 – my aim is to choose a title on the day (not too large, maybe 100-200 pages) and scan and post-process it throughout the event. I’m hoping to have the resulting digitised book uploaded to the Internet Archive by the time everyone leaves.
A major disadvantage of these book scanners is limited availability, which is likely to remain true for some time to come. However, although these scanners are not yet as readily available as sheet-fed scanners such as the Fujitsu ScanSnap, I believe print material preservation has less urgency than software preservation, as books don’t suffer bit-rot like disks inevitably will.
We can afford to wait for an Internet Archive partner centre to open up here in Australia, or for a local makerspace to get such a scanner, or for a community member to make one themselves, or for a community member to be in a position to scan items in this way on behalf of the community.
Another disadvantage of these scanners is the need to turn the pages manually, which increases the time to scan an item. The Robos and Dinos scanner has a counter-weighted system to hold the perspex down. This is easily lifted to turn the page, which reduces the time between scans, but this system is still not as fast as an automatic sheet-fed scanner.
Post-processing is another area where the kit and DIY book scanners currently fall behind commercial sheet-fed scanners. They often rely on open source software for not only capturing the page scans, but also for cropping and doing other necessary adjustments to make them into easily distributable and good quality PDFs.
But, as with most areas of computing, progress is swift, and I believe there is no longer any need to remove the spines of items being digitised – they can be digitised and physically preserved, which is surely a win-win.
With the removal of the need to destructively digitise print items, I believe physical preservation of items being digitised should be as high a priority as the digitisation itself.
The strength of my belief does vary (very slightly) according to the nature of the item:
I think one-off or rare items should be physically preserved during digitisation;
I think books which are known to have several or many surviving copies are potential candidates for destructive digitisation, but I still strongly prefer all copies remain physically preserved;
I think more widely disseminated items, such as user group magazines, are the ones I feel least strongly about – as long as there are confirmed multiple extant copies (or they can be dismantled and re-bound as mentioned above);
I think there are some items which cannot be easily digitised either way – books with large fold out leaves, for example: a per-item judgement call would need to be made by the owner of such items and/or the community the digitised copy is meant for (NB: the Scribe system does have a large image capture accessory which I think can cater to at least some of these edge cases).
The actual condition of the item itself does not enter the equation for me – while I sacrificed the “worst condition” copies of Lon Poole’s books I owned, I still deeply regret even this “lesser” sacrifice. If I only had one copy of an item which was in poor, but still bound, condition, I would only non-destructively scan it, rather than having its spine removed just to make digitisation easier.
The Internet Archive is taking the time and spending the money to digitise and physically preserve print items – that fact alone was what got me started adjusting my attitude. Seeing the non-destructive book scanner at Robos and Dinos cemented this form of digitisation as the preferred default in my mind.
When researching others’ views for this post, I found a blog post written by Internet Archive preservationist Jason Scott (who was one of the Skype video callers during WOzFest 5¼″). Jason makes the case that something is lost when an item is physically altered for the sake of digitisation, and that really struck home with me.
After reading that post and giving it more thought, I came to realise how much binding can tell you about an object or its producers – it’s a form of physical metadata:
Did a user group skimp on production costs and only use one staple?
Did user groups or software publishers who staple-bound print items guillotine them after stapling to avoid pages extending past the cover (which would speak to having extra money to spend on appearances)?
Did publishers or software houses change their binding methods according to the whim of their business performance or prospects? An example would be small software or book publishing houses moving from staple-bound to perfect-bound titles as their business grew.
Did page elements extend into the inner margin, and, if so, how carefully were the elements on facing pages made to line up (which speaks to paying printers more for such alignment and “quality assurance”)?
Of course, much of this information is secondary to the goal of digitisation and dissemination of the content of these books – but we don’t know today what will interest researchers or enthusiasts in the future.
While dissemination of information is important to a vibrant retrocomputer community, I strongly believe physical preservation of items is equally important for historical context – physical preservation along with digitisation gives the widest view of the past to future enthusiasts and researchers, and, I believe, should be a goal we all strive for.
This sort of “physical metadata” is potentially lost to future researchers if “only copies” of items have had their spines removed – and it’s sometimes hard for an owner to know if a particular edition or print run survives in only one copy, while other editions may have several surviving copies.
Is my recently acquired (and prized) early copy of the First Edition Apple II User’s Guide with apples of layered colours (as opposed to other Editions having single colour apples on the cover, see below) the only extant copy with that design? It may or may not be, but I’d not seen it before, despite a 15 year interest in that title and its variants. I wouldn’t want to damage it physically only to subsequently find out it was!
Additionally, what might pass as an acceptable scan today may be found wanting in 1, 2, 5…maybe even 10 or more years. Having undamaged physical items available for rescanning with better technology in the future allows that better technology to be utilised to its fullest extent.
On this point, I’ve noticed several preservationists have revisited their earlier scanning efforts to re-scan items at higher resolution and/or to post-process them with newer tools – it will always be better to have an unaltered original for such rescanning efforts.
Another important consideration is that while scanning technologies for non-destructive digitisation will only improve, they will also continue to get cheaper – since Jason Scott wrote the above-linked post, the Scribe system has reduced in price from US$25,000 to US$13,000, just shy of a 50% price drop in three years!
Reduced cost and improved post-processing will see non-destructive digitisation be within the reach of more and more retrocomputing enthusiasts as time goes by, and I’m hoping that destructive scanning will fall by the wayside. As far as I’m concerned, this can’t happen fast enough!
Be sure to let me know your thoughts in the Comments below.
It’s annoying that a little hiccup can lead to a lengthy hiatus in my Apple ][-related projects and this blog.
I had been trying to maintain a weekly posting schedule, and also keep various tasks on my Apples moving along, but I lost access to the Man Cave for a couple of weeks, and everything just fell by the wayside!
However, this post marks the reboot of my generally successful period of moving retro things forward, I promise!
I’ve already started planning for the next WOzFest, with an expected timing of November – I have some very particular ideas about the name and theme, and I intend to provide attendees with a very real memento of their participation! Look for the announcement over the next several weeks.
I have full access to the Man Cave again, which will allow me to finalise my Disk ][ refurbishment project I began before WOzFest $04 – I’ll do a write up on that shortly, including discussing my “only make it once” ribbon alignment mistake, and how I identified and rectified the resulting damage.
I’ve decided on a surefire path to move another major project forward, which will be the topic of my next post, and hopefully several more during the month of October (hint, hint!).
All in all, I’m excited at the prospect of “getting back into it” in very tangible ways! I hope your retro projects have not been as neglected as mine have been recently.
On the software preservation side, 4am and Brutal Deluxe Software are amongst those making old software, especially copy-protected titles, available for use – often via copy protection cracking, and often detailing the cracks to allow them to be reversed or studied.
These preservation efforts often require a non-standard disk image file, such as a .edd file, made using an EDD+ card and software like Brutal Deluxe’s i’m fEDD up. These files preserve the stream of bits coming off the disk before they are decoded by the Disk ][ controller card, and this data can be captured down to a quarter-track resolution (can those in the know please correct me in the comments if I’m misrepresenting this?).
Occasionally a .nib file, which records extra track data (such as the DOS volume number) beyond a standard 140K .dsk image file, is enough to defeat copy protection which relied on this information.
So I’ve been thinking recently that it would be nice if we could use Apple ][ disk drives on modern computers, say via a USB-based device, to capture not only .edd files, but also .nib and .dsk files (ProDOS-ordered and DOS-ordered). Let’s call it the “Disk ][SB”.
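As an aside, the difference between DOS-ordered and ProDOS-ordered 140K images is just a per-track shuffle of the sixteen 256-byte sectors – sectors 1 to 14 of the two orderings are mirror images of each other, so the same reordering converts in both directions. A minimal sketch (my own illustration, using the commonly documented mapping – not code from any of the tools mentioned here):

```python
# Sketch: reorder a 140K Apple ][ .dsk image between DOS 3.3 (.do)
# and ProDOS (.po) sector order. The mapping is its own inverse,
# so one function converts in either direction.
TRACKS, SECTORS, SECTOR_SIZE = 35, 16, 256

# Sectors 0 and 15 stay put; sectors 1-14 are mirrored (s -> 15 - s).
SECTOR_MAP = [0] + list(range(14, 0, -1)) + [15]

def reorder(image: bytes) -> bytes:
    assert len(image) == TRACKS * SECTORS * SECTOR_SIZE  # 143,360 bytes
    out = bytearray(len(image))
    for t in range(TRACKS):
        base = t * SECTORS * SECTOR_SIZE
        for s in range(SECTORS):
            src = base + SECTOR_MAP[s] * SECTOR_SIZE
            dst = base + s * SECTOR_SIZE
            out[dst:dst + SECTOR_SIZE] = image[src:src + SECTOR_SIZE]
    return bytes(out)
```

Applying it twice returns the original image, which is a handy sanity check – and it makes clear why a .dsk file, in either order, can never capture the protection tricks that .nib and .edd files exist to preserve.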
I know I’m not alone in contemplating such a device: Apple ][ luminary Mike Willegal worked on an interface card in 2008 and 2009 with a view to having a final version utilising USB. Glenn Jones indicated on Mike Willegal’s site that he had worked on a similar device at some point in the past. Both projects are currently on hiatus.
Glenn pointed me to the Device Side Data FC5025, which connects a PC 5¼” floppy drive to modern computers via USB and is a currently active project – this is pretty close to what I’m suggesting, but can only create .dsk Apple ][ disk images, and can’t read “flippy” disks, which were not uncommon in commercial Apple ][ software, let alone in home use. Perhaps the most famous example of a flippy disk in Apple ][ circles is the original Karateka disk, which would allow you to play the game upside down if the disk was inserted upside down.
Further along the path to deep-reading of disk data is the KryoFlux, which reads the magnetic flux transition timing from disks and saves that data to modern computers. This is, perhaps, the bee’s knees of software preservation – but it’s also between €98 and €125 (plus the cost of a floppy drive), which for me is above my budget.
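To illustrate what “magnetic flux transition timing” means in practice, here’s a minimal sketch (my own illustration, not KryoFlux code) of how a stream of transition intervals might be resolved into a bitstream, assuming the nominal 4 µs Apple ][ bit cell:

```python
# Sketch: turning flux-transition intervals into a bitstream, as a
# flux-level imager must do downstream of capture. Each transition
# marks a 1 bit; every extra elapsed bit cell before it contributes
# a 0. Real decoders also handle drive speed variation, which this
# deliberately ignores.
BIT_CELL_US = 4.0  # nominal Apple ][ 5.25" bit cell

def flux_to_bits(intervals_us):
    bits = []
    for dt in intervals_us:
        cells = max(1, round(dt / BIT_CELL_US))  # bit cells elapsed
        bits.extend([0] * (cells - 1) + [1])
    return bits
```

So an interval of roughly 4 µs decodes as `1`, roughly 8 µs as `01`, and roughly 12 µs as `001` – which is why flux-level capture sits below even the .edd bitstream in the preservation hierarchy: it retains the timing itself, not just the bits inferred from it.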
I envisage the Disk ][SB as operating somewhere between the KryoFlux and the FC5025 – not as low-level as magnetic flux transition timing, but higher resolution than the .dsk images the FC5025 will produce. Almost like an EDD+ card for modern computers. Having it Apple ][-specific meshes nicely with my computer model chauvinism. Perhaps the only “special” hardware required would be a physical Apple ][ disk drive.
And then, of course, there’s use in emulators. Charles Mangin, through his RetroConnector store, offers various adaptors for using legacy hardware on modern computers, and modern devices on legacy computers, such as his Joystick Shield for using Apple ][ joysticks on modern computers, including use within emulators. How cool would it be to be able to boot an emulator from a physical disk in a Disk ][?!
While contemplating such a device, my mind keeps returning to the Apple II Pi, which integrates modern hardware with ancient. On one hand, I wonder if the Apple II Pi could be utilised in some way in my grand scheme for modern disk image capture, while on another it makes me think that surely it would be possible to design a new USB-based solution for connecting Disk ][s to modern computers (and I’m aware that’s just the certainty of ignorance passing judgement on the Disk ][SB’s feasibility – I’m no hardware or software engineer).
So, to summarise, my initial wish list for the Disk ][SB is:
allows connection of Apple Disk ][ drives (20-pin connector) to modern computers via USB;
allowing DB19-based drives to connect would be a bonus (and is SmartPort support too much to wish for?);
I’ve certainly known of Ron Wayne’s part in Apple’s founding, and how Mike Markkula not only helped Apple establish itself with finance and guidance, but also by writing some of the early Apple-branded software to help showcase the Apple ][’s capabilities when it was first released.
What I didn’t know until last week was one little snippet about Mike Markkula’s impact on Apple culture which is still on display in many products and product announcements throughout the company’s history – the programs he wrote were published as being authored by none other than “Johnny Appleseed”.
Yep, that’s right – the perennial Apple-using chap who shows up in probably every screenshot of a phone call, contacts list, iMessage chat, or e-mail shown during Apple keynotes and product announcements, and as a dummy name programmed into many Apple products, started his association with Apple 40 years ago as the programming nom de plume of one of Apple’s early founders.
I simply cannot imagine how this fact has eluded me all these years – I suppose there’s a chance I missed it when reading or hearing about it in the past, but it really is exactly the sort of factoid I tend to remember and take note of. I can find it referenced on websites going back to at least 2010, and I’m sure it must have been mentioned or relayed somewhere before then.
I know I don’t know everything there is to know about Apple – but I obviously know at least a little less than I previously thought!
It says something of my obsession that I was thrilled to learn even this little tidbit – here’s to learning a heap more!
It takes a lot to inspire dedication – people have to care very deeply about something to be dedicated to it.
Apple ][ aficionados are by no means the only dedicated species on this planet: car enthusiasts, Raspberry Pi tinkerers, painters on art tours – we all share a degree of passion which outsiders often view askance…while being so dedicated to their own “thing” such as a football team, a TV show, or a favourite restaurant.
But today, I was reminded of just how dedicated my fellow retrocomputer enthusiasts are, in a few (very) different ways.
Firstly, there was an e-mail from Ken Gagne, the Editor-In-Chief and Publisher of Juiced.GS, the world’s only remaining (and longest running) Apple ][ print magazine. [Disclaimer: Ken kindly commissioned a story for the June ’16 issue of Juiced.GS on WOzFest from me.] To continue sourcing material for and publishing Juiced.GS takes a special kind of dedication. I burned out after only a few years of co-editing the Club Mac magazine, MACinations – Ken has been Editor-In-Chief of Juiced.GS since 2006! And, it seems, he has no intention of stopping – what an effort!
Secondly, Jeremy, an attendee of WOzFest ///, posted his gallery of photos from the event on the WOzFest /// Galleries post I finally got around to putting up. Jeremy drove up for WOzFest /// from Canberra, a 6-hour round trip. And Jason and Geoff flew up from Melbourne to attend – it takes a certain kind of dedication (or crazy) to go so far for what is, in effect, a one-night informal gathering of Apple ][ collectors – what an effort!
Steve had already gone above and beyond in extending his Floppy Emu beyond its original scope of being an early Macintosh floppy drive emulator. It now emulates pretty well any Apple drive designed to use the DB19 connector, including SmartPort drives, the Macintosh (non-SCSI) HD20, Apple ][ drives and Lisa drives. Steve could have quite rightly rested on his laurels and said, “No more DB19? No more Floppy Emus!”. Instead, he took the bull by the horns and solved his own supply issue and that of other enthusiasts – what an effort!
With dedication like the above, there are many exciting days ahead for us Apple ][ collectors!