What is the small web?
In short, I think of it as the space on the web (or on other protocols) focused on text writing like blogs, without ads, excess images, platforms, cookies, pop-ups, videos, and the like.
The "small web" is a term that's lately been used to describe websites, online spaces and protocols that focus on a "text-first" environment. Folks who make websites with this ethos use basic semantic HTML, often write blog posts or longform pieces, and favor little or no design. There are tens of thousands of websites of this type, and even search engines dedicated to finding them. They prioritize reading and concentration and de-emphasize applications or a platform-centered environment. These websites look good in any browser (including command line web browsers), run on old or slow computers and devices, and theoretically use less energy and bandwidth. They also eschew pop-ups, ads, trackers, clickbait, random image headers, sidebars, and other web junk. They're fast and meant to get right to the point in their writing.
Example sites:
=> https://simplifier.neocities.org/ Simplifier
=> https://sjmulder.nl/en/textonly.html Text-only websites
=> https://leetusman.com My current text-only landing page
=> https://cheapskatesguide.org/ Cheapskate's Guide to Computers and the Internet
Search engines for the small web:
=> https://lieu.cblgh.org/ Lieu webring search engine
In addition to http, there is a relatively new alternative internet protocol that's been designed specifically to favor this kind of small web mentality: the Gemini protocol. Only about 4 years old now, it has perhaps 10,000 or so users/viewers/readers/writers. There are dozens of Gemini clients (what would be called a web browser on the www is called a "client" here). I use Elaho on iOS, or Lagrange or Amfora on my laptop.
"Gemini is a new internet technology supporting an electronic library of interconnected text documents. That's not a new idea, but it's not old fashioned either. It's timeless, and deserves tools which treat it as a first class concept, not a vestigial corner case. Gemini isn't about innovation or disruption, it's about providing some respite for those who feel the internet has been disrupted enough already. We're not out to change the world or destroy other technologies. We are out to build a lightweight online space where documents are just documents, in the interests of every reader's privacy, attention and bandwidth." (from geminiprotocol.net)
=> https://geminiprotocol.net/docs/faq.gmi Project Gemini FAQ
=> https://en.wikipedia.org/wiki/Gemini_(protocol) Gemini (protocol) on Wikipedia
=> https://gmi.skyjake.fi/lagrange/ Lagrange client
=> https://github.com/kr1sp1n/awesome-gemini Awesome-gemini list
The following Gemini capsules (aka sites) must be opened in a Gemini client, or, if you want a proxy that works on the web, try entering them on the Smolnet Portal proxy (https://portal.mozz.us/gemini/mozz.us/). On that page, enter the URL in the input box and click Go.
=> gemini://warmedal.se/~antenna/ Antenna feed aggregator
=> gemini://gemi.dev/cgi-bin/waffle.cgi NewsWaffle
=> gemini://bbs.geminispace.org/ Geminispace BBS
=> gemini://skyjake.fi/~Cosmos/recent.gmi Cosmos
An interesting facet of Gemini is that you can include images, audio and video, but they'll never be shown in-line. In other words, to see an image, you have to click on its link, which in most clients will download it to your computer and open it locally. In Lagrange, it will open in-line after you click on it.
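This falls out of how gemtext (Gemini's markup format) works: there is no inline anything, and every link, including links to media, sits on its own line. A tiny sketch of a capsule page (the URLs here are made up):

```text
# A post title

Paragraphs are just plain lines of text, with no inline formatting or inline media.

=> gemini://example.org/next-post.gmi Links each get their own line
=> /photos/harbor.jpg An image is simply a link; the client fetches it when you ask
```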
There are a large number of experiments on Gemini. I really like Antenna (linked above) as a way to see a bunch of recently updated blogs, with (ideally) clear titles to help indicate what writing I'll find. You can also use a feature of Antenna to customize the feed.
For a similar experience on the web, one can use an RSS reader and subscribe to many blogs and sites.
The small web dovetails nicely with concepts of permacomputing and continuing to use technology and computers long after their expected expiration date. These websites and gemini logs (and especially Gopher, one of the influences on Gemini) work well on underpowered and even ancient machines, yet they're fast, and they offer a great reading experience. You can try it and see what you think.
This particular application I'm covering is not necessarily the "small web," but I still think it's worth writing about because it points to a shared experience for a small group of people, which I increasingly think is a useful way to try to use the internet.
Photo.exe is made by Rustling Broccoli. This app is a simple and free dithering photo-taking program. When you take a photo and tweak it, or import an image, you can choose to save or export it to their single public feed. Everyone contributes to this same anonymous feed, and because of the dithering, there's a shared aesthetic that makes it feel like everyone's part of a tiny friend group. "Join our un-social network," they say. It's a low volume stream, like a peculiar alternative to Instagram where you didn't choose your friends or followers, and all images are lofi dithered. Still, it's a nice view into other people's lives. The images are compelling. While the stream is public, the URL isn't public facing, and can only be found by clicking in the app to link to an external page. I think I'll honor that by not linking to it here but directly to their application website. You'll have to download the app to try it.
=> https://photo.breq.net/ PHOTO.EXE app website
I like this idea of a shared small almost-community. I don't think it would work well if hundreds or thousands were using it, but for the few dozen of us on it, with about one post every few days, I think it's a nice space.
This is also similar to how I use Mastodon, where I'm trying to maintain a friend group of no more than about 150 people (Dunbar's number). It makes for an intimate space, but one that still feels like it has enough variety to keep things fresh. If I check once or a few times a day, that's fine, but if I check in and read or post once every few days, or even less, that's fine too.
My decade-old Raspberry Pi 1B
I was particularly interested in trying out this challenge because I've gotten more interested in permacomputing and resisting consumption of technology, especially of computing devices. On some of the online new media and technology educators groups I'm a part of, I feel outside the norm. Rather than chasing higher resolutions, more "latest technology", and ever-more-expensive black box tools, I like to teach long-running tried-and-true technology and tools, especially how we can customize, own, understand and break them down. This, along with being able to critique and evaluate our own place in technology (and its ties to capitalism and military technologies), feels important to me.
In the past couple of years a disparate group of folks online have been discussing a nascent movement called permacomputing.
Permacomputing is both a concept and a community of practice oriented around issues of resilience and regenerativity in computer and network technology inspired by permaculture.
In a time where computing epitomizes industrial waste, permacomputing encourages the maximizing of hardware lifespans, minimizing energy use and focussing on the use of already available computational resources. We do this {because} we want to find out how we can practice good relations with the Earth by learning from ecological systems to leverage and re-center existing technologies and practices. We are also interested in investigating what a permacomputing way of life could be, and what sort of transformative computational culture and aesthetics it could bring forward. (from the permacomputing wiki)
To participate in the Old Computer Challenge I had to figure out what computer to use, and I didn't want to purchase another used computer just to participate. At my home I have both an old Chip computer and a Raspberry Pi. I had some issues getting the Chip to run properly, so I found an old Raspberry Pi 1B in a shoebox, a computer I had purchased in 2013 as I was first getting deeper into programming and open source software and hardware.
I've written previously on working to create digital archives of DIY and artist-run communities. One interest of mine is to create a usable ecosystem of tools intended for long-term preservation of digital assets like texts, images, videos, websites and the like. Being able to create long-running computers, even from old computers, is another goal of mine, an attempt at making resilient machines that could continue to work for several decades. One way to do this is to work with long-running command line software.
The command line has long since been supplanted by GUIs for most people, save programmers and techies. Those of us who use the command line enjoy its speed, consistency, and ability to automate or script repetitive computational tasks. Importantly, while GUI software constantly changes due to the whims of style, audience, and complications of the graphics pipeline, much command line software works the same month after month, year after year, sometimes for decades. And it works fairly consistently on every machine.
A papershoot camera shot of the Raspberry Pi computer display on the side of my desk in front of a wall and bookshelf. The display shows the command line, a photograph of my bookshelf, and the Lagrange Gemini protocol client.
I set up Void Linux, an independent distribution of the Linux operating system, and since I run this on my personal laptop and studio machine, I found it fairly straightforward. For a couple days I ran just the command line without installing a graphical system. On the command line I could use text editors to write this blog, play text games, even read the internet, Gemini, or check the weather all using text-based software.
This all seemed to work fine and not much slower than my much faster and relatively new (from 2019) i7 laptop. Even startup of this old underpowered computer went fast enough, maybe 10 seconds from power to the command line. I thought about just sticking with the command line for the week.
I continued to read the news each day, and countless blogs and gemini logs (glogs!). But a day or two later I decided to do the full GUI setup. I wanted to use a full web browser and look at images. I installed a window manager (i3) and the X window system to see what that's like. It lets me have multiple workspaces and splits my screen to have multiple programs running at once. It shouldn't have been surprising, but I immediately noticed a slower system. And in fact things started to become annoyingly slow. I didn't even bother installing Chromium (the open source browser Chrome is built on) or Firefox as I knew they would be too painfully slow, so I installed Dillo and NetSurf, which are lightweight GUI web browsers, and Lagrange as my GUI gemini client.
Did I cheat? You bet. Especially at the beginning of the week, I was checking my work email and doing professional work on my regular laptop. Later in the week I experimented with doing some of my research work on the Pi, which went okay, though slowly. But there was an upside to that. I found that due to the time it took to look things up, it was harder to goof off on the Pi, and so I avoided social media except for approximately once-daily check-ins to my Mastodon server community.
The other elephant in the room: I continued to use my phone this week for communication, maps, Instagram. I think it probably would have been better to take a social media and phone fast for the week, but I was too wedded to communicating quickly with friends and family. I'm also in an exhibit, and I performed a live concert during the week, so I also felt the need to promote these on Instagram. Could I have done without the self-promotion? Perhaps I should have.
For entertainment this week I wanted to play some games. In addition to the text-based dungeon crawler Rogue, which I've been playing for years, I downloaded Pico-8, the Raspberry Pi edition. I'm planning on teaching a class in making games in Lua, probably with Pico-8, next spring, so I thought this could be a fun environment to test. At first I couldn't get it to run, but with some searching and forum discussion online I was able to decipher which libraries were missing, and I downloaded and installed those directly via Raspberry Pi's firmware GitHub repo.
I launched Pico-8, smiling at my success. I have made a number of Pico-8 games, and played many, for the past 7 years. On launch, Pico-8 warned me that it would run at less than 30 frames per second but I persisted. I launched splore, the games browser, and instantly had access to thousands and thousands of games made by people around the world. I tried a 3d game, and it was quite slow, but still manageable. I forgot to test Poom, the Pico-8 clone of Doom, but my guess is that it would be too slow to be fun and that I should stick with puzzle games.
I next tried to watch a movie. I used the command line software yt-dlp to download the French science fiction short film La Jetée by Chris Marker. Since it's subtitled and told in a series of still images with a soundtrack, I figured that even if the video couldn't play at a reasonable framerate I'd still be able to watch. Alas, it was too slow and I gave up in frustration.
I was able to program on the computer, using my favored command line coding program Vim/Neovim, but I'm fairly certain GUI software like Geany could have worked as well.
I'm still talking about the command line now: I also downloaded bsd-games, a collection of many games like Backgammon, Adventure, Boggle, cribbage, hangman, Mille Bornes, snake, tetris, and a few dozen more. I didn't try all of them but it's nice to know they're available. When I teach Social Software this coming semester I'll be adding these games to our shared server.
For "consuming" image and video content such as social media, the Raspberry Pi 1B is not a great system. Even for my professional work, I'd need to be a bit careful. The system occasionally quit all my programs and dropped me back at the login screen. This happened about once an hour or so, always when I was multitasking with multiple workspaces. I should have resisted multitasking to prevent this! Thankfully my text editor has a swap backup system that auto-saves everything written, but it was an annoying hiccup when it happened.
In terms of reading, writing, being entertained: this system worked for me, with the caveat that I sometimes just needed to be patient when doing anything with graphics. Void Linux is a great Linux distribution, and it's incredible that a volunteer team maintains the operating system and repositories of software that work for fast new computers as well as old minimal ARM computers. Contrast this with a billion-dollar company like Apple. There's no way they'll individually respond to you on a forum online or in a chat room to help you fix something on a decade-old computer.
It was also nice to be using these old computers, writing on them and posting to the internet, reading about other folks' experiences with their own Old Computer Challenge. It helps to be working in community.
As the week using this computer ended, I was heartened to know that this 10-year-old, pocketable computer can still browse websites I've made, and is still a useful machine with tools that feel timeless to me.
Going forward, one of my next projects is to work on some simple command line and file browser tools for viewing collections of images and text in the Archiving Artist-Run Spaces project. I'd love to have a system that is more resilient and resists brittleness by falling back on basic command line programs and the OS's GUI file system defaults to do the lifting, essentially coding by gluing together basic long-running Linux software. I'm looking forward to testing this out in the weeks ahead.
In short, I use the command line because it's fun, logical, and expressive, and because it lets me avoid planned obsolescence and arbitrary restrictions. These are not things everyone cares about, but they're things I, as an artist working with code, care about! So read on for the details:
I use the command line to resist software and hardware planned obsolescence. It's almost the opposite of an iOS app. I own (this is embarrassing) 3 iPads: my parents' from 2011, my own from 2017, and one from school from 2021. The 2011 can't run much of anything. The 2017 worked fine but Apple ended support. They won't let me update the system and won't let me download any apps from the app store because the system isn't up to date, a catch-22. On the other hand, the command line still works on these. It won't disappear one day. It won't be made obsolete. I can still run my command line software on these.
And following this: the command line and programs for it are not an "app store" and aren't controlled by a single entity or company. It's controlled by me. It can run anything I want.
I use the command line because it works the same or similarly across computers, operating systems, tablets and phones.
I use the command line because it works on practically anything: low-power, ancient hardware works fine. I can still browse the web, read and post to social media, read books, keep a to-do list or spreadsheet, or do anything else I'm used to doing. As we grapple with climate change and how our technology choices and consumption impact the earth, it's helpful to use tools that work just as well on our older but still fine machines.
I use the command line because it can be automated. I use this to resize directories of photos in a split second, rather than individually opening and tediously resizing in something like Photoshop. I can backup updated files on my hard drive ultra-quickly. These are just two among many examples.
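As a sketch of the kind of automation I mean, assuming ImageMagick and rsync are installed (the directory names here are made up):

```shell
# write resized copies of every JPEG into small/, leaving originals untouched
# ('1600x1600>' means: only shrink images larger than 1600px)
mkdir -p small
mogrify -path small -resize '1600x1600>' *.jpg

# back up the photo directory; rsync copies only new or changed files
rsync -a ~/photos/ /mnt/backup/photos/
```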
I use the command line because I can fit different programs together to meet my needs in a way that GUI software can't or won't. Why pay a niche internet service that may or may not provide what I need when a program or two or three can pull it together easily in a minute?
I use the command line because it's fun to learn, and not so difficult. With a dozen or so commands you can get started and do a lot. You can quickly learn how to piece them together, how to get help, and how to intuit the way additional programs will work.
I use the command line because command line games are addictive. I have played (and not yet beaten) Rogue and Brogue for almost a decade. Backgammon and Chess in the command line are fun and clear. Colossal Cave Adventure is a fun retro experience I particularly enjoy showing my students.
I use the command line because it's just plain convenient and easy, faster than the GUI, and just lets me get on with my regular computer needs.
Lately a number of folks in my online community have been discussing the problems of the platform Discord, from issues of centralization, monetization of community conversation and labor, and accessibility issues, to the inability to archive or save discussions or content. Some folks have gone back to using a combo of blogs and the old chat system IRC. In my own communities I've cut back on my participation in Discord and created alternative self-hosted forums or piggy-backed on Mastodon instead.
Inspired by a CLI asciinema recording by software and hardware developer Phil Hagelberg on how to use IRC, I started to think more seriously about the idea that many cloud services could be replaced by some Linux software or some lines of Bash code gluing programs together.
First, I want to address a straightforward question or critique someone might have. Namely, why use the (Linux) command line instead of a simple web platform? Another criticism of this approach might be that using the command line could appear to be just retro nostalgia, or unnecessarily complicated.
To answer these: I think using tools that don't cost much or any money, that we or others can modify and share, and that we can combine together to meet our needs is empowering. I have a bicycle on which I can fix a flat tire, replace the chain, and do minor maintenance like adjusting the brakes. Sometimes I spend money at the bike shop, such as when I needed generator lights installed and wasn't confident in my own work. The bike is what I use to commute to my studio, and how I get around town. And yet I feel okay working on it, and the knowledge gained from trying lets me step in and fix something when I need to, though I'm not afraid to get extra help at a bike shop or from knowledgeable friends. This isn't a perfect analogy, but it'll do.
Likewise, working with computer tools and software doesn't need to be intimidating. You can learn a bit at a time, try things out, find tutorials, and look for community to help you along the way. It's also a good way to resist commercialization. Rather than buying "products" with their incessant upgrading, annual subscriptions, or throwing out old products to get access to the latest ones, you can opt out. Like a bicycle, command line and text user interface software can be beautiful, elegant, luxurious, or just minimally functional. With some basic Bash knowledge you can get pretty far.
I'll admit that some Free, Libre and Open Source Software can be ugly or clunky. I am also sympathetic to resisting pure retro-nostalgia, but I don't think continuing to use the command line in 2023 (or whatever year you are reading this) is simple nostalgia. Bash and the shell predate GUI software, and while I won't make a definitive prediction, they could even potentially outlast it. The shell never disappeared, and the amount of command line software has increased enormously in the past number of years. And most of it continues chugging along, working year after year. It's not that uncommon to be reading a command line software man(ual) page and see that it lists a year in the '80s! And it's still useful.
On Linux, our basic automation tool is the command shell. As opposed to cloud software and GUI software, it's often much easier to develop and certainly to glue together command line software. The term "glue" here means to combine command line software in various ways, sometimes envisioned by the developer but other times not. The software will have ways to take input, to be modified or configured, and standard ways to produce output. All this allows it to be composed with other command line software. We still don't have a great way to "glue together" GUI software in the intuitive way that we weave together software in Bash. Using Bash to glue tools together is such a fundamental advantage on Linux systems because it was intentionally built into Unix from the beginning.
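A classic small demonstration of this composability: counting the most frequent words in a file by piping a handful of single-purpose tools together (the sample file here is invented for the demo).

```shell
# make a little sample file to work on
printf 'apple Apple banana apple cherry banana\n' > notes.txt

# split into one word per line, lowercase, count, and rank
tr -cs '[:alpha:]' '\n' < notes.txt \
  | tr '[:upper:]' '[:lower:]' \
  | sort | uniq -c | sort -rn | head -5
```

Each program in the chain knows nothing about the others; standard input and output are the whole interface.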
Glue code is often considered to be a form of "duct tape programming." It's fast. And glue code solutions might be considered a "hack" approach, not necessarily implying the original meaning of hacker here. And while this could imply that glue code doesn't last long, it can also be thought of as an advantage. Whereas cloud services and platforms are black boxes where you put in money or privacy/data and get out a simplified output or software product, Bash and other glue code lets you pick and choose, customize, see the innards, and test your own ideas by typing them and running them in the command line REPL. You continue to refine it until you find a solution, and then you can automate that solution with scripts, cron, and the like.
Without further ado, I present some sketches of ideas, recipes and speculative ideas on how to glue together your own alternatives to FAANG and other startup products in the command line. Some of these are easy peasy for those new to the CLI. Others will require a bit more knowledge and experimentation.
Some of these will run on your own computer. For others you'll want access to a server, either because you need to "sync" with someone(s) else, to store info remotely that can be accessed by multiple people, or for other reasons. You could join a tilde community or you can…
For many of these solutions, if you have your own server, either a spare old laptop, a Raspberry Pi, or a remote server, you won't be as reliant on cloud services.
For an easier-to-configure server for this kind of thing, try yunohost.
Setting up a server is beyond the scope of this article, but there is lots of documentation online if you do a search, or read the yunohost website.
Nano and WordGrinder are simple and attractive text user interface-controlled word processors.
Alternatively, text editors such as Emacs, Vim, Neovim are old faithfuls.
Use ttyshare or tmate or even tmux with ssh to create a shared terminal session. Then both open your text editors.
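For the tmux-over-ssh route, the flow might look like this (example.org and the session name are placeholders):

```shell
# person A, after ssh'ing into the shared server:
tmux new-session -s writing

# person B, ssh'ed in as the same user, joins the same session:
tmux attach-session -t writing

# both now share one terminal; open a text editor and write together
```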
Alternatively, use Vim's server capability.
Vim: enable server capability
Git or Subversion plus a shared remote server, perhaps one you set up with yunohost.
You can use Gitea.
Or even simpler, bare git on a server, see:
Idiomdrottning's How to host git repos on their Gemlog.
Put a spreadsheet on a server that others have access to editing. Or use email. Or host a form on a server using cgi. Have people select the best option. Schedule an email to you or all at the end date with the results. You could set it up with cron or just mark on your calendar to check back on a certain day.
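For the scheduled tally, a single crontab entry can do the mailing. This is a hypothetical sketch: the file path, date, and address are made up, and it assumes a working mail command on the server.

```shell
# at 9:00 on July 1st: count the votes (one per line) and mail the results
0 9 1 7 * sort /srv/poll/votes.txt | uniq -c | sort -rn | mail -s "Poll results" you@example.org
```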
I save my todo list as a text file. There's also the todo.txt project.
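A plain-text todo list needs nothing beyond the shell. As a sketch, here are two tiny helper functions (the function names are my own invention):

```shell
TODO_FILE=./todo.txt

todo()   { echo "$*" >> "$TODO_FILE"; }    # add an item
todone() {                                 # drop a finished item
  grep -vF "$*" "$TODO_FILE" > "$TODO_FILE.tmp"
  mv "$TODO_FILE.tmp" "$TODO_FILE"
}

todo "water the plants"
todo "write blog post"
todone "water the plants"
cat "$TODO_FILE"    # prints: write blog post
```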
Pyradio is excellent.
Imagemagick is incredible. Lots of recipes are available online.
I use chafa to browse images in the command line. If I'm going through directories of images I use fff, a vim-like file manager. Pressing i over an image file will open it inline overlaid in the terminal.
The terminal emulator Terminology can also show images inline in the terminal. For example, tyls is like the Linux ls command, set up to display images of all files.
For browsing the web in the command line there are a number of great programs, but w3m has the w3m-img plugin that renders images in the command line. Add the -H flag to get "high quality" images.
links text browser has the -g flag that enables graphics mode in the command line.
I have a simple bash script that generates html image galleries that I host on my web server. I use imagemagick to resize.
cyclenerd has an example called gallery.sh that automates this.
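My script is not shown here, but a minimal version of the idea fits in a few lines. This sketch of my own only writes the HTML; the ImageMagick resize step is left as a comment since it needs that package installed:

```shell
#!/bin/sh
# generate index.html linking every jpg/png in the current directory
{
  echo '<!DOCTYPE html><html><head><meta charset="utf-8"><title>Gallery</title></head><body>'
  for img in *.jpg *.png; do
    [ -e "$img" ] || continue    # skip the literal pattern when nothing matches
    # mogrify -path thumbs -resize 400x400 "$img"   # optional thumbnail step (ImageMagick)
    printf '<a href="%s"><img src="%s" alt="%s"></a>\n' "$img" "$img" "$img"
  done
  echo '</body></html>'
} > index.html
```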
Alternatively you could use Nextcloud to share images. I haven't tried it yet so you'll have to explore on your own.
If you use mastodon, try toot.
twtxt is a minimalist social media protocol, like a minimal distributed Twitter (sorry for the birdsite comparison!). You can use a web client or browse and post in the command line.
Looking for a dropbox or wetransfer alternative? Try The Null Pointer
Or alternatively, upload to your own server. For sharing files on your own server, use scp.
How to use scp to securely transfer files
For backups, rsync.
On Ctrl-c club tilde we use the iris command line forum software with hundreds of users on a single shared server. We love it.
This is a fun category. You can check the weather a few ways.
curl wttr.in
There are some other options you can pass in too.
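A couple of wttr.in URL options, for example (these hit the network, so try them in a terminal; the location is a placeholder):

```shell
curl 'wttr.in/Brooklyn?format=3'   # one-line summary, handy for scripts and status bars
curl 'wttr.in/Brooklyn?m'          # metric units
```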
Or use ansiweather.
Many of these solutions, systems and recipes are barely more complex than using a cloud service alternative. And they won't be mining your data, selling your info to advertisers, or trying to sell you additional services. With a little bit of elbow grease they can be put to good use. Help is often a search engine query away, or why not try posting on IRC? Many of these programs can be tailored to your own use case. These programs are almost always free, and some take donations. Beyond initial setup (if any), these programs also eschew all of the advertisements, pop-ups, alerts, notifications and other cruft that contribute to mental exhaustion while using some of the commercial platforms these programs replace.
This also doesn't need to be an all-or-nothing affair. Pick and choose what works for you. That's the beauty of having access to free and open source software.
I hope you find some solutions to your own needs, and glue together your own software ecosystem.
Archiving Artist-Run Spaces was created as a resource for experimental artist communities, both physical spaces as well as online-only creator communities, to build archives of their activities.
While I consider the site to still be evolving, with new resources and additional archives to be added to the listing page, I think the site is already useful and elegant and ready to meet the world.
It is based on my two decades of experience working in DIY spaces and artist-run communities and identifying unmet needs in the space. Just as these spaces attempt to carve out their own place in the world and make their own home and culture, there is a mismatch with their values when artist-run communities use commercial platforms for presenting or hosting their work. Just as an art space can be forced to close due to rising rents, many platforms have come and gone over the years, changed their business models, or had sites or passwords misplaced. Recognizing that many exhibits, images, articles, web-based projects, and pieces of documentation disappear due to link rot, this site's goal is to help artist-run communities think through the value of what they're engaged in and how they may want to strategically preserve some aspect of themselves online. This project isn't meant to replace the need for professional archivists but rather to provide possibilities in the DIY spirit of these communities, who often rely on the strength of their own networks.
Sharing resources is an important goal for the site
This site was made with homemade, simple and open source tools with an eye toward creating an attractive website while minimizing bandwidth and maximizing readability and longevity. The pages are written in markdown, a lightweight language that is mostly just plaintext with some formatting marks, so it can be scanned and read as-is, or rendered to a static HTML page. You are either reading this as the static website or as a markdown text file right now.
The style of the page is meant to reflect these values. No external fonts are loaded. Visitors will see default typefaces. The logo is a system emoji. There is no tracking on the website, no ads and no cookies. As far as one can know, no Google, Facebook, or Amazon resources have been used.
The archive page is a simple grid, built with CSS grid and with a nod to the plain, simple, elegant design of brutalist websites.
The style reflects maximizing text and making it easy to read and scan a page with your eyes or with page scanners. A palette of 8 colors was selected from lospec.com.
No JavaScript is required to view or navigate the site. Media queries are used to resize the font for readability on small screens. Where possible, images have been compressed to minimize page load.
Pages are written in markdown and converted to html with a simple homemade static generator based on the free libre program Pandoc.
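The generator itself is not shown on this page, but the core of such a script is a loop over pandoc. This is a hypothetical sketch; the directory names and stylesheet are made up:

```shell
mkdir -p site
for page in pages/*.md; do
  pandoc --standalone --css style.css "$page" \
    -o "site/$(basename "${page%.md}").html"
done
```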
Other sites whose design inspired the design of this site:
Brutalist Websites, Text-only NPR, Solidarity Infrastructure Interviews, Low Tech Magazine's Low-tech Website, Buttondown archived posts, Compudanzas, Cerca forum.
The site was made by Lee Tusman. Archiving Artist-Run Spaces is a project by Lee Tusman and its design and content are shared under the Peer Production License.
Caleb Stone designed and did programming for Experimental Archive Space and Gas. Andrew LeClair provided design support for Gas. Amelia Marzec helped debug Gas. Peter Erickson co-organized the Little Berlin archive.
Thanks to Ceci Moss for directing Gas and working with us to build the archive.
Thanks to the many photographers, writers, videographers documenting.
Thanks to the many artists of these special spaces, including everyone at Space 1026, Babycastles and Flux Factory. Thanks to Merveilles for ongoing web community. Eli Mellen graciously gave helpful advice.
The copyright guide was created by Batya Kemper and Lindsay Harris at NYU's Technology Law & Policy Clinic.
Color palette Sea Sorcerer by Noah Rowlands.
Site architecture and header and footer design adapted from Compudanzas.
This is the 3rd or 4th HTML Energy meetup I've participated in, and I enjoy it because of its community-building friendliness. It feels a bit like working in a library with folks around you concentrating on similar activities. It's good "energy." There's a sharing component where folks bring snacks and also share our completed sites at the end of the session. It feels beginner-friendly as well. There's no instruction per se, just a time for people to jam on web 1.0 energy.
Gathering around to see what everyone created
This Saturday we met at Valentino Park in Red Hook. I met a lot of new folks and enjoyed sitting next to the rolling East River.
There was a series of printed out "prompts" to guide creation, but we were also encouraged to pick our own topic or theme, or to take inspiration from the scene around us. I chose that route, and decided to make a site dedicated to that moment and space. We were waiting for a storm to pass, and the river was splashing a bit more aggressively because of it. There was a cool breeze, scattered conversations, passersby, and the occasional roving band of kids or a party nearby on the riverbank. And we were sitting by a number of Red Hook warehouses, in varying states of re-use and decay, with graffiti and murals painted on them.
After spending approximately an hour coding, the group of 23 of us paused, placed our computers on blankets, and walked around to look at what each other had made.
An array of computers, notebooks and devices showing off our screens
I didn't bring a laptop this time (or any previous time!) so I actually wrote my code with pen in my traveler's notebook first. This was a bit of a challenge because I also wrote CSS and JavaScript code to make the site dynamic, which maybe goes a bit against the HTML ethos? After that I used a command line app on my phone (Shelly) and ssh'ed into a server where I coded in vim. I wanted to take photos and include those as well, but couldn't figure out the easiest method to do that. Someone suggested using social media or imgur. I ended up logging into glitch.com and uploading images there, grabbing the URLs and pasting them with vim into my HTML page. A few other folks coded on their phones or tiny alternative computing devices.
You can check out the site project page here:
=> project info
And a direct link to the project running in a browser:
=> redhook
I took photos of the scene around us with Bitcam, which gives a dithered aesthetic, and manipulated these with ImageMagick on the command line. I attempted to make a simple site of overlapping, overlaid images, with some motion and randomness to allow the page to change dynamically. This was a challenge to code in JavaScript on paper, but I tested and fixed it once I typed it in. So this project's website output reflects the conditions it was made in. Maybe it gives a small feel for the calmness mixed with the energy of the folks around me, and the feeling that it could start storming any moment!
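As a rough sketch of that kind of command line manipulation (the filenames here are made up, and this doesn't reproduce Bitcam's exact look):

```shell
# Ordered dithering in grayscale gives a chunky, low-color look.
# Guarded so it does nothing if ImageMagick or the input is missing.
if command -v convert >/dev/null && [ -f photo.jpg ]; then
  convert photo.jpg -colorspace Gray -ordered-dither o8x8 -resize 480x photo-dither.png
fi
```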
A screenshot of the redhook site I built. The final site is dynamic.
Coding on a blanket in a park with friends next to a river: recommended!
For the past several months I've been working on building a digital archive for Gas gallery, a mobile, autonomous, experimental and networked platform for contemporary art located in a truck, parked at various sites around Los Angeles. Gas was founded and directed by Ceci Moss, and Ceci was an ideal partner to work with. Over the past two years I've been working on building out systems of support for artist-run spaces to engage in archiving projects, so this was a perfect collaboration.
The shows at Gas were adventurously multidisciplinary and experiential, and every aspect (the truck gallery, web projects, performances, publications, editions) played an equally important role in presentation. The space's inherently itinerant and fluid format allowed considerable independence and creative freedom in terms of concept, site, format, audience, and engagement.
The site presents the full history of Gas, including artwork, artists, zines, artist editions, event documentation and writing from the director. Caleb Stone was the site designer, and created a gorgeous design. Andrew LeClair provided technical assistance. And a special thank you to Amelia Marzec, who helped Caleb and me debug a difficult cross-browser issue.
For those interested in the "stack," the site is built in Kirby, which allows us to run a browser CMS but render the site out statically. We made this choice to give more weight to site longevity and resiliency while still letting us build a backend database of site content. There are many hundreds of image files, each rendered at multiple sizes, and each with its own metadata file.
Now that the site is released, I'd love to hear what you think. Thanks to all of the artists whose work is included, to Ceci for working with us, and to Caleb and Andrew (and Amelia).
The Gas gallery archive is part of a larger project on Archiving Artist-Run Spaces. For a previous archive I've worked on, also with Caleb, check out Experimental Archive Space for Space 1026.
I'm working on two workshops relating to archiving artist-run spaces, to teach techniques, tools and ideas in this area, and as a run-up to publishing a website on this topic. This May I'll be presenting a workshop at the ISEA conference, the International Symposium on Electronic Art, in Paris.
In June I'll also be presenting a workshop at the HASTAC 2023 conference at Pratt Institute in Brooklyn. HASTAC = Humanities, Arts, Science, and Technology Alliance and Collaboratory. At Pratt I'll also be presenting an exhibit of a number of archives I've been building.
Recently, at the request of a student, I helped secure her a laptop to use in my intro computer science class; she was looking to get into the major and didn't have a functional computer at home, just a phone. My school does laptop loans (locked-down Macs), but we decided to build up a Linux machine she could use as well, one that would let her explore the operating system and write her own software, which is much easier on Linux. Since I use Linux myself, I thought I'd write a little about it.
I asked friends for suggestions of where to get a near-free functional computer, and I received a suggestion to look in local thrift stores for old Chromebooks. Unfortunately, that's not really a viable option in NYC. There are few thrift stores around, and those that exist generally do not sell electronics, much less old Chromebooks.
But Craigslist is thriving here, and I saw a variety of cheap Chromebooks for sale. Would they be powerful enough to run Processing? By default, ChromeOS is not really able to run programs for writing your own software; it's just a minimal cloud-based computer attached to Google services. I looked online. It seemed very possible to boot an alternative Linux OS, particularly a Debian-based distro, but nothing was definitive (for the nerds: technically ChromeOS is a stripped down Linux system). There were some listed options for using a chroot to run Ubuntu, but it appeared slow and everything I found was years old, so I wasn't sure of the current state. I also saw a variety of difficult options for installing Ubuntu (or another *buntu) on top of ChromeOS.
A little searching didn't fill me with confidence: it was immediately clear you can't just flash an ISO to a USB, launch an installer and call it a day. No. For this reason, I decided to do the install myself first, in case I got stuck for hours or days in the process.
I found someone on Craigslist looking to part with a near-new Chromebook. I met up with them at Grand Central and ended up trading $40 for the laptop. That's a bargain!
I read a dozen websites and fora trying to come up with a plan. It looks like years ago it was pretty easy to flash Ubuntu and there were many options; now Google has really locked down the computer. The main approach is to:
Start up the Chromebook and connect to Wi-Fi.
I have a Samsung Chromebook 3, which comes with Braswell firmware. When I looked it up, I read that it must be updated.
I rack my brain and decide I will try to run Linux off an external drive and see if that's viable.
The procedure, in detail:
Press Escape + Refresh (3 buttons to the right of Escape) + Power. Let go of Power and it'll start up in Recovery Mode.
When it starts up it tells me to insert a USB. Nah, not yet. Press Ctrl-D to turn off verified boot. Wait 10 seconds and, when prompted, press Enter. It'll reboot and ask me to hit Enter again to confirm. I make sure it says OS Verification is OFF. Press Ctrl-D again. It begins reflashing my drive and booting into Developer Mode. This takes about 5 minutes.
Eventually it starts back up and I see the traditional welcome screen on ChromeOS. I skipped signing in and clicked Sign in as Guest in the bottom left so I didn't need to make a Google account.
When I finally get into the desktop environment I open the Terminal with Ctrl-Alt-T.
I type shell and enter to get a Linux command line.
I switch to bash with root privileges to make changes to booting by typing sudo bash and hitting enter.
I get the typical sudo warning, the "with great power comes great responsibility" mumbo jumbo. I type enable_dev_usb_boot and hit enter. A success message appears.
I open the terminal with Ctrl-Alt-F2 (right arrow). I log in as "chronos". No password. I type sudo crossystem dev_boot_legacy=1 and hit enter. The first weird thing occurs: the prompt tells me to try dev_boot_altfw instead. Hmm. I run sudo crossystem dev_boot_altfw=1 and get no error, so maybe this is "alternative firmware" and it worked?
I'm on a Samsung Chromebook 3, which runs Intel Braswell, so as I mentioned, its firmware needed updating. I ran the firmware update automation script from mrchromebox.tech:
cd; curl -LO mrchromebox.tech/firmware-util.sh
sudo install -Dt /usr/local/bin -m 755 firmware-util.sh
sudo firmware-util.sh
This brings up a menu.
I tried to select 2) Install/Update UEFI Firmware, but I couldn't because write protect was enabled: I wasn't able to take off the back of the machine to remove a screw. But I could do 1) and update the current Legacy BIOS firmware.
I downloaded a GalliumOS ISO even though it's not maintained anymore. I had been thinking I'd download Lubuntu, but against my better judgement I decided to try Gallium since it was built specifically for Chromebooks. I used Balena Etcher to write the ISO to a USB drive.
Once written, I plugged the USB drive into the Chromebook and started up in Recovery Mode by pressing Escape + Refresh + Power.
Press Ctrl-U to start from the USB. Or was it Ctrl-L?
It starts up and I run the installer, which is self-explanatory: picking keyboard, city and the like. It takes just a few minutes. I shut down and remove the USB installer. Fingers crossed...
It works! Each time the computer is turned on we need to press Ctrl-L to escape the regular startup process and boot into Gallium.
Once it's started, I have a really nice looking desktop and the machine (despite being pretty low power) is fast! I opened up the terminal and sudo apt updated and sudo apt upgraded. Then I tried out the package manager App Grid Software Center and installed LibreOffice, Solitaire, Tetris and some other software for my student. There was already a variety of basic apps for playing music, displaying images, managing files, etc. I visited the Processing website and downloaded the latest version, 4.0.1. I used tar to uncompress it and then ran the bash installer file. This installed Processing to the desktop and placed its icon in the launcher menu under Development. I wrote a basic Processing Hello World program, tested that it ran, and saved it.
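The Processing install amounts to unpacking the tarball and running its installer script; a sketch, with a hypothetical archive name (check processing.org for the current one):

```shell
TGZ=processing-4.0.1-linux64.tgz   # hypothetical filename
if [ -f "$TGZ" ]; then
  tar xzf "$TGZ"                   # unpacks into ./processing-4.0.1/
  processing-4.0.1/install.sh      # adds a launcher entry
else
  echo "download $TGZ from processing.org first"
fi
```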
Lastly, I downloaded a picture of my college and set it as the desktop background. Before class I presented the computer to my student and she was excited to get it. She began using it that day.
I hope it will be useful for her classes, browsing the web, doing homework and practicing coding in Processing.
There are a variety of other Ubuntu-derived free Linux distros.
In the summer I was an artist in residence at a museum. There's a drawer with old tech gear, including mixers, speakers and cables. I set up a temp recording studio to work on electroacoustic music with a friend. In a drawer we also found an old computer, an HP Envy, I think from 2010. It had Windows on it, but I had to start up in Safe Mode just to try it out, because I didn't know the admin password, and the computer didn't have "Service Pack 1" installed, which prevented the browser from being updated! When I finally got the darn thing running I saw it had Windows 7. Okay! So my next step: wipe the computer and put a new Linux OS on it. But which one?
Since the laptop I found is 13 years old, I needed a lightweight distro. I wanted the computer usable by friends here without any previous Linux experience, and I didn't think a minimal tiling window manager like i3 would be a good experience for them. I remembered that Lubuntu was the lightweight Ubuntu, so should I try that?
The projectās goal is to provide a lightweight yet functional Linux distribution based on a rock-solid Ubuntu base. Lubuntu provides a simple but modern and powerful graphical user interface, and comes with a wide variety of applications so you can browse, email, chat, play, and be productive. Lubuntu was formerly a distribution for low-end hardware, but we have refocused.
I downloaded the ISO, wrote it to a USB drive with dd, plugged it into the laptop and restarted. In a minute it was up and running. I double clicked the installer and about 10 minutes later I was up and running from disk. I had never used Lubuntu before; it comes with Openbox as its window manager, which I don't think I've used before either. But it was obvious what to do: open the menu in the bottom left. I found the menu categories and built-in basic apps pretty usable. I found the qt-terminal, sudo apt updated and sudo apt upgraded, quickly downloaded some basic programs (neovim, curl, w3m, tldr, kate, kitty, amfora, can't remember what else) and some basic art/music programs (krita, audacity, puredata, rhythmbox, love2d).
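For reference, the dd step can be sketched like this. The ISO filename here is hypothetical and /dev/sdX is a deliberate placeholder: confirm the real device name with lsblk first, since dd will happily overwrite whatever you point it at.

```shell
ISO=lubuntu-22.04-desktop-amd64.iso   # hypothetical filename
DEV=/dev/sdX                          # placeholder: confirm with lsblk first!
if [ -f "$ISO" ] && [ -b "$DEV" ]; then
  sudo dd if="$ISO" of="$DEV" bs=4M status=progress conv=fsync
else
  echo "set ISO and DEV before running"
fi
```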
For a 13-year-old laptop, the thing is fast. Installing with aptitude is almost as quick as on my much more modern laptop. This old clunky computer practically flies. Because I've used Ubuntu so long I didn't feel like I needed to learn anything. I have two complaints so far that I haven't solved: it's not obvious to me how to pin programs to the built-in taskbar, and I can't seem to get the keys for special characters on the Danish keyboard to perfectly match the US keyboard layout, but it's close enough. Those are the only issues I've had so far. Sound works great. The audio interface I plugged in seemed to work fine; I recorded an album of field recordings using a handheld voice memo recorder, then mixed it on the computer. And everything looks pretty nice. I'd recommend a similar setup in a heartbeat!
In a previous post I wrote about finding a 13-year-old non-functional Windows laptop, throwing Lubuntu on there, and having a "new to me" snappy computer. In this post I'm writing about re-using "trash."
The dumpster is a generative space for both discarding past images or objects and finding new ones, and new works can be constructed using this detritus.
This is a sentence pulled from a description of an online artspace I built for fellow artists and members of my art collective. I haven't really seen online trash space built into networked spaces or communities previously, but it's so fundamental to the way I work that it seemed like a necessary thing to include when I build online community: a shared space for discard and re-use of materials.
Right now I'm (IRL) in northern Europe with my collective, visiting a museum to do a series of projects. We've been here before and are fairly familiar with the town we're in. One of our favorite places to go is the Reuse Center, which is open noon to 1700 a few days a week. We get chairs, dolls, balls, wood, bottles, old electronics, wheels, platforms... these are just off the top of my head.
In my home city of NYC there is a space called Materials for the Arts. It's a large multi-story warehouse operated by the city that collects discarded items from city agencies. In my experience it's mostly art teachers who visit to get supplies. But the members of my collective (officially a non-profit) usually send 1 or 2 folks a month to gather materials for our upcoming exhibits. Things we've gotten there in the past: bucket paint, canvas, old violins, a hot dog vending cart, sandwich board signs, stereos and speakers, phone cases, nails, wood panelling... this is only a small sampling from my memory.
I'd say the vast majority of exhibits we've presented featured at least some materials from these spaces, used to build out artworks, the physical gallery infrastructure, props for performances, and more. In addition, as we move around the city we'll text each other when we find good usable items discarded on the street or in dumpsters.
Of course re-using consumerist excess helps reduce new consumption. A lesser but still valid benefit is that working primarily with discarded or waste materials provides a useful constraint around your activities, "artwork", or other projects, and serves as a starting point for deciding what you'll make next, whether as a meal (if it's food) or as artwork or material for performances.
The main tools for collecting materials for re-use are so simple: a large bag, sometimes friends to help you move huge things, a bike or other transport. Gloves are nice to have for rough materials. Some previous experience working with materials helps too, so you can brainstorm new uses when you see raw ones. I think it's important to leave behind materials that you don't have a vision for re-using; it doesn't make sense to just move trash along that will need to be dealt with elsewhere. In my city there are Buy Nothing groups, "curb alerts" people post on Craigslist, and even explicit hand-made "FREE, works"-type signs that people tape to things left on the street.
Some cities have these kinds of re-use centers, "free" areas of choice materials recovered at the dump, and some cities even have explicitly artist-run recycling programs.
SF's Recology Center
Philly's Recycled Artist in Residency
2022-04-01 update: I've added a link to my full code for a player + computer opponent version of the PIG dice game at the end.
2022-04-06: added a link to a repo with code for my Tiny BASIC games. I have a better opponent AI now, though still quite primitive.
Lately I've been getting into BASIC. I was a kid in the 80s and 90s and I remember those computers that would boot into a BASIC interpreter. I didn't have one of those, but came in contact with one every year or two and played a handful of text games on them. I was aware of some of the commands and syntax, GOTO and the like, and I have looked through the classic BASIC Computer Games book a number of times. The ecosystem of BASIC interested me but I hadn't delved too deeply. Recently I read about Tiny BASIC:
Tiny BASIC is a family of dialects of the BASIC programming language that can fit into 4 or fewer KBs of memory. Tiny BASIC was designed in response to the open letter published by Bill Gates complaining about users pirating Altair BASIC, which sold for $150. Tiny BASIC was intended to be a completely free version of BASIC that would run on the same early microcomputers. -- Wikipedia [1]
So originally, Tiny BASIC was a specification, not an implementation. The People's Computer Company published a newsletter, almost like a photocopied zine to my eyes, with articles, tutorials, comix, all aimed at the nascent hobbyist computer community. They invited Dennis Allison of Stanford University's Computer Science faculty to write the spec.
The magic of a good language is the ease with which a particular idea may be expressed. The assembly language of most microcomputers is very complex, very powerful, and very hard to learn. The Tiny BASIC project at PCC represents our attempt to give the hobbyist a more human-oriented language or notation with which to encode his programs. [2]
The newsletter goes on to describe the motivation for the project, a free implementation of the BASIC language, and the community then working on it. It specifies what the language could entail, how to solve various problems, a discussion of creating a compiler versus an interpreter, what it will take to build one's own Tiny BASIC, and a request for feedback and ideas. It also contained some simple BASIC games.
One of the earlier implementations was Dr. Li-Chen Wang's Palo Alto Tiny BASIC, and he may have devised the term copyleft to describe this process of source code being openly shared, modified and re-published. He affixed the notice "COPYLEFT ALL RIGHTS WRONGED" when he published it in 1976.
BASIC flourished as a language throughout the 80s and into the 90s. Many versions of BASIC proliferated, and many versions of Tiny BASIC as well, including some that grew into more extended versions, sometimes including the ability to create graphics or sound, rather than just ASCII text.
In fact, the initial Tiny BASIC implementations allowed printing text output but couldn't receive text string inputs. These were very simple implementations of BASIC, as they had to work with low memory usage. They allowed for (integer) variables, subroutines via GOSUB/RETURN, if-then statements (though not if-then-else), numerical though not char/string input, and not much else!
The allowed statements were:
IF - THEN statement
GOTO #
INPUT var
LET var=expression
GOSUB #
RETURN
CLEAR
LIST
RUN
END
Strings weren't defined in the notes, nor were "remarks", aka comments. Missing also were for-loops, random number generation, and arrays, though some of the Tiny BASIC dialects did add these.
I decided to try my hand at making a simple dice gambling game. Where I grew up, Threelo was a popular dice game, and my friends and I had our own house rules. But to warm up, I first implemented Pig, a good first game to program due to its minimal actions and ease of programming. Essentially, each turn you roll a die and add the result to your points. You can stop at any time and keep that total, or keep rolling. If you ever roll a 1 you lose all the points you accrued. That's it! Pretty ...(wait for it)... basic.
I downloaded Damian Gareth Walker's Tiny BASIC interpreter and compiler project, written in C. [3]
It packages a man page and some example games (Hunt the Wumpus, Tic-Tac-Toe, and some others).
Without a built-in random number generator, how was I going to create a random die roll?
Luckily, Gareth published some instructions for constructing a minimal, not-very-sophisticated random number generator. [4] We don't have the privilege of referencing the computer's clock, for example, so we follow early BASIC tradition and ask the user for a seed number, then perform a simple calculation. Some other implementations of Tiny BASIC come with a random number generator; Gareth's doesn't by default, but it does add the ability to use REM (remark) for commenting.
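That generator is a tiny linear congruential generator: multiply, add, then take a remainder. For illustration, the same arithmetic in shell, with an arbitrary seed (the constants are the ones from that note; the BASIC version asks the player for the seed):

```shell
R=42                           # seed; the BASIC version gets this from the player
R=$(( (5 * R + 35) % 6547 ))   # LET R=5*R+35 then LET R=R-R/6547*6547 (i.e. mod 6547)
D=$(( 1 + R % 6 ))             # LET D=1+R-R/6*6, a die roll in the range 1..6
echo "$D"
```

With seed 42 this prints 6; reseeding with the updated R on each call gives the next "random" roll.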
lemonade.bas running on Tiny BASIC in CoolRetroTerm
So here's my small and not terribly fun or sophisticated game of Pig. No doubt many improvements can be made.
REM --- PIG Dice game test
REM --- Created: 2022-03-30
REM --- Created for cyningstan's Tiny BASIC
REM --- No one will want to use this code, but consider it public domain CC0.
REM --- Variable List
REM
REM R - A random number returned by the Random Number Generator
REM D - Current die roll
REM T - TOTAL SAVED
LET T=0
REM --- Initialise the random number seed
10 PRINT "Enter a number:"
INPUT R
REM --- MAIN GAME LOOP
20 PRINT "YOUR CURRENT TOTAL IS ",T
PRINT "Would you like to roll? (0 no, 1 yes)"
INPUT Q
30 IF Q=0 THEN GOTO 300
REM --- CHECK IF BUSTED
GOSUB 200
LET D=1+R-R/6*6
PRINT "YOU ROLLED ",D
IF D=1 THEN GOTO 50
LET T=T+D
GOTO 20
REM --- BUSTED!
50 PRINT "BUSTED!"
LET T=0
GOTO 20
200 LET R=5*R+35
LET R=R-R/6547*6547
RETURN
300 PRINT "YOUR SCORE IS ",T
PRINT "GOODBYE"
UPDATE: I've added a link to a 2-player version (player + computer opponent) of Pig dice to the links at the end. Source code license: COPYLEFT ALL RIGHTS WRONGED, natch. There are likely errors, as this is the first time I'm programming in any BASIC and I made this in less than an hour; sure enough, I was making spaghetti GOTO code! But you are welcome to improve or build upon this in any way you like.
From here, it would be relatively easy to create some version of Blackjack with a little more effort. Of course, it would also be fun to make text adventures, pizza-ordering calculators, the classic text game Lemonade Stand, or a version of my favorite BASIC game, the TI-83 calculator version of Drug Wars/Dope Wars (that's a story for a different time). To play the game you'll need to download Tiny BASIC and save the program with a .bas extension. The computer opponent sorely needs better AI than what I've implemented.
There's a procedurally-generated maze / dungeon adventure game, like a simplified text RPG, called Kingdom of the Lyre that was made for and entered into PROCJAM in 2019. [5] I played it. It's challenging and I thought it was fun. I'd love to have a Tiny BASIC game jam one day.
To wrap up, I'll leave you with this WANTED ad from Volume 1, Number 1 of Dr. Dobb's, in the section "My, How Tiny BASIC Growed":
WANTED: Entirely new, never before seen, Tiny Languages, imported from another planet or invented here on Earth. Especially languages for kids using home computers that talk to tvs or play music or run model trains or...
Wikipedia article on TinyBASIC
Dr. Dobb's Journal of Computer Calisthenics and Orthodontia, January 1976 - PDF
Tiny BASIC Interpreter and Compiler Portal
Minimal random number generator for Tiny BASIC
Kingdom of the Lyre game
Tom Pittman's Tiny BASIC User Manual
my Tiny BASIC games repo with a better computer opponent for PIG dice game