9 Years of Reclaim Hosting

On Saturday Lauren Hanks reminded me that Reclaim Hosting celebrated its 9th birthday. I get confused about the official formation date; I oscillate between the 28th of July (which Gusto—our payroll/HR service—notes as my anniversary) and the 23rd, the date I traditionally associated with it in celebratory posts over the years like this one. So I guess I am going to make the 23rd the formation date and the 28th my first day of work 🙂

Nine years. Crazy to think we are approaching a decade of Reclaim. I mentioned in the 7 year post how Reclaim was 1) growing and starting to dial in a more definitive sense of culture, and 2) imagining a reality where Tim and I were not as central as we had been at the start.

Image of TV with Reclaim EdTech on screen

To the first point, I think our hires in 2021 and 2022 have really settled the questions around culture, which have been about intentionally building a team rooted in a support mind-set that is informed and reinforced by educational technology. It helps that Goutam, Pilot, Taylor, and Amanda all came from Domain of One’s Own programs; their understanding of higher education and deep commitment to a vision of technology that both augments and transforms education was foundational to being able to dream up and roll out an Instructional Technology team in a few short months to start 2022. That has been a gigantic shift in Reclaim’s understanding of itself; that said, what it means more specifically is still yet to be determined—which makes it that much more fun! It’s a moment where we can explore, experiment, and figure it out, which I believe is a sandbox for all kinds of magical possibilities.

As to Reclaim operating without Tim and me as central, this has been sealed over the course of our ninth year. If you told me a year ago that Tim would be entirely removed from the day-to-day of Reclaim Hosting starting January 2022, I would’ve laughed… nervously. But that has been the case: between Lauren ruling the Director of Operations position like a boss; Chris taking over infrastructure and truly shining, not only adroitly managing a mighty fleet of servers but also making them that much more secure; and Meredith stepping up big time on the regular to train everyone on our team to become fluent in frontline support, we’ve all gotten better as a result. And while Tim’s creative innovation at Reclaim is legend, we now have nine well-rounded team members that truly do make Reclaim bigger than either of its founders, and that’s the dream.

As for me, I’m not going anywhere ’cause Tim now owns the amazing Reclaim Arcade, so Hosting is all I got! It does help that I love it, and I want to keep experimenting with what a marriage of hosting, support, and edtech looks like as we continue on this journey. In this regard, I have to say our ninth birthday marks a moment where we can not only sustain the laser-focused support our community has come to expect, but also provide broader outreach thanks to Taylor’s community work and Pilot’s Roundup newsletters—we’re now able to think beyond the immediate. This means building on experiments like the OERxDomains21 conference delivery platform for ongoing professional development (thanks to Tom Woodward and Michael Branson Smith); more experimentation with container-based edtech; as well as thinking through how Domain of One’s Own, WordPress Multisite, and Reclaim Cloud represent a multi-level offering that gives schools a wide range of options as part of our services—all of which remains undergirded by edtech-driven support.

So, as I reflect on our ninth year of Reclaim Hosting, I believe we are entering a new phase wherein we have the headspace to experiment more and re-think how our ostensibly unrelated products can be understood as part of a greater whole, all while creating a culture of the possible at Reclaim Hosting, one that understands educational technology need not be a clarion call for the apocalypse, but an imaginative way to build cool, fun things that make a difference on a human scale. I’ll take nine more years of that!


Bavacade Update: G07 Capkit, Condor PCB, and Cheyenne Audio

I have been working on and off on the bavacade, and I am pretty close to having every game working perfectly; that said, actual perfection continues to elude me. But we need lofty goals and standards, right? So the hunt continues.

At the end of June I did my first capkit on a G07-CBO chassis that was removed from Robotron. This chassis is now a backup, given I put a replacement G07 chassis the Arcade Buffett fixed into Robotron and it works beautifully. After doing the capkit on the bad G07 I tested it on my Condor machine, because that also uses a G07 (I have 4 games that do: Astro Invader, Bagman, Condor, and Robotron) and its picture suffers from some waviness at the bottom, so I figured it would be a good candidate for replacement. I tested the chassis with the new capkit and it was out of sync for about 5 minutes or so when turned on, but once it warmed up it was perfect. I read around and there might be an issue with my capkit—surprise, surprise—so I’m gonna have to try it again, and if that works I will do another on the original G07 from the Condor that was wavy.

Image of Condor play field

Condor with G07 from Robotron looking amazing

Also, I played a bit of musical chairs with the G07 chassis, so gonna document that here. During testing I moved the G07 the Arcade Buffett repaired, which had been in Robotron, into Condor, and that looks awesome. I then took the G07 in Bagman and put that in Robotron, and that might be the best G07 chassis of the lot. So gorgeous.

image of Robotron screen

Robotron with G07 chassis from Bagman looking good!

So, right now Bagman is without a chassis, but I will rectify that once I re-do the capkit I flubbed. And even if that fails, I have the original G07 from Condor that I can do a capkit on—practice makes perfect—and then take the working G07 from Condor that was in Robotron previously and put it in Bagman; after that I should be all set until the next one goes down. Also, are you following all of this? 🙂

Another point worth mentioning is that while working on the G07 capkit and testing it on Condor, the tube was making a crackling noise, which is not good. This is what they call “arcing,” and it means the high voltage from the anode in the tube is somehow jumping. The solutions I have read about suggest taking out the tube and cleaning it as well as you can, and then cleaning it again and again and again. I did a bit of cleaning and it got much better, but there is still a slight crackle, so I may have to clean it again and then use some insulating lacquer around the anode. Here is a post on KLOV that describes a similar issue.* I also noticed the chassis I did the capkit on had issues with the suction cup that connects to the anode hole on the tube. It was not grabbing well and was in overall bad shape. So, I will also be replacing the flyback on my next attempt, which includes a new suction cup for the anode.

In terms of the Condor, I got the PCB issues with the audio fixed (it was really low and staticky); it turns out it was a bad volume pot (or potentiometer), so now the sound works perfectly and the monitor looks good. The original board, which is now the backup, was having some graphical issues, so I am getting it looked at locally before committing to sending it off to be fixed. Condor will be like new with a capped chassis, a refurbished board, and, if the arcing gets out of control, maybe a new tube next 🙂

Cheyenne Cabinet Playing Crossbow

The other game that has had some issues is Cheyenne. At first it was the curling of the monitor image that led to getting a backup Polo 20″ chassis (that was itself having issues and went back in for repair; I just got it back today but still need to test it). The original monitor chassis seems to be working well now, but in the interim issues with the audio started to occur. We tested the original speakers and they seemed to have some power-related issues, so they were replaced with new speakers. But despite that there was still no sound, and it seems like the issue might be related to some resistors on the sound board that might have gotten damaged. We tried routing around the board-based amplifier to test that theory this morning, but we still got nothing, so Roberto took the audio board with him for more testing. The board work continues to fascinate me, and watching Roberto read a schematic has given me some hope, but the fact he doesn’t speak English and my Italian sucks is definitely a challenge 🙂 The problem seems to be related to the arrow at P8 (closest to the margin in the image below), which is the audio connector from the board to the speakers. He tried to bypass that and tap into the audio signal before that connector to test where the issue was occurring, but the test didn’t work, so he will be digging in more on his bench and I will be cheering him on via email.

Cheyenne Audio PCB schematic

The good news is I have an extra Cheyenne/Crossbow board coming from the US that should arrive any day with the rest of my containerized belongings, so we should be able to test even further with a different audio board and hopefully solve this issue.

The one last thing I need to do is re-visit Asterock and add back the voltage regulator we removed to see if that issue is fixed, because right now we are using a switching power supply, and I think that once the voltage regulator goes back in the machine we can have everything run off the original power supply. I will need to look at that this week, as well as do a capkit or two on the G07 chassis I have in front of me. Unfortunately it has been so damn hot that the idea of doing any soldering-related work has been less than appealing, unless I take care of it at 5 or 6 AM. Nonetheless, this gets me caught up on recording the on-again, off-again work that has been happening over the last month or two.

____________________________

*Although I just turned this game on and I am not hearing any crackling, so perhaps this is sorted.


WordPress Multi-Region: Is the Third Time a Charm?

Hope springs eternal in the bava breast. This is the third attempt since November of 2021 to get a WordPress Multi-Region setup working on Reclaim Cloud; I blogged attempts one and two in some detail. And I’m glad I did, because returning to it after four short months it’s like I’m starting from scratch, so the blog posts help tremendously with the amnesia—they serve a similar purpose as the polaroids in Memento.

My return to Multi-Region was spurred on by the realization that the documentation provided by Virtuozzo (formerly known as Jelastic) noted the WordPress Multi-Region Cluster installer was a Beta version, whereas the one I’ve been playing with in Reclaim Cloud is still Alpha. This led me to look through the Jelastic marketplace installers (JPS files) on Github to see if there’s more than one installer for WordPress Multi-Region setups, and while I could not find the Beta version of the WordPress Multi-Region Cluster installer, I did find a beta installer for a WordPress Multi-Region Standalone setup. The difference between the two is that the standalone is not creating multiple app servers, databases, etc., within a single environment that is then reproduced across as many as 4 environments in different regions. This significantly reduces the complexity, which gave me some hope that this just might work.

What’s more, I’m already running bavatuesdays in a standalone WordPress environment using Nginx (LEMP) on Reclaim Cloud, so the difference would be that this new instance uses LiteSpeed (LLSMP) and replicates the entire instance in one additional data center (there were no options for more than two regions). It is two standalone instances that are replicated across two different regional data centers. Here is to hoping simpler is better.

The fact that I am deep into container learning this month helped my confidence a bit, particularly when grabbing the URL for the manifest.yml file that provides the directives for setting this up in Reclaim Cloud. We don’t have the WordPress Multi-Region Standalone installer available, but you can still install it by going to the Import option in Reclaim Cloud and copying the manifest.yml URL into the URL tab:

Import tool to grab a manifest.yml file to build out the WordPress Multi-Region Standalone installer

Once you click Import you will be given the options for setting up your WordPress Multi-Region Standalone setup:

You are limited to two regions with this installer, and the first region you list becomes your primary, but I’m not convinced that matters as much as it does with the Cluster Multi-Region setup. After that, you let the two environments set up, and the URLs of each will be something like bavamulti-1.uk.reclaim.cloud and bavamulti-2.ca.reclaim.cloud.

Once the environments are created you will get an email with the details for logging into the WordPress admin area (the same for both setups) as well as LiteSpeed and MySQL database credentials. [Note to self: that is an important email, so be sure to save it.] Once everything is set up you need to do a few things to each environment:

  • Make sure you have added two A records for your custom domain: one record for each environment’s public IP address. I use Cloudflare for this, and it looks something like the sketch just after this list.

  • Add a Let’s Encrypt Certificate for your custom domain in each environment:

  • Update the site URL in both environments using the WordPress Site Address addon:
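Circling back to that first item, in table form the two Cloudflare records look roughly like this, with documentation IPs standing in for each environment’s real public addresses:

Type   Name               Content         Proxy status
A      bavatuesdays.com   203.0.113.10    DNS only (the UK environment, example IP)
A      bavatuesdays.com   198.51.100.20   DNS only (the CA environment, example IP)

I keep these records DNS only (grey cloud) rather than proxied, so the dig tests further down can see the origin IPs directly.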

If you are starting from scratch then you should be good to go at this point with the setup. I had a few extra steps given I’m importing my files and data from an existing WordPress instance; to do this I used rsync to copy files between environments and a command-line database import, given the web-based phpMyAdmin import was consistently timing out.

Rsyncing files between environments has been a bit of a struggle for me in previous attempts, but luckily I documented this process, and I finally feel like I have made a breakthrough in my understanding, although I still had to lean on Chris Blankenship for help this time around. Here are the steps:

  • Create a key pair on the environment node you are migrating from and copy the public key (the file ending in .pub) into the authorized_keys file on the new environment node you’re moving to. Here is the command I used to create the key pair on the node I am migrating from:

ssh-keygen -t ed25519 -a 200 -C "jim_at_reclaimhosting.com" -f ~/.ssh/bava_ssh_key
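For what those flags are doing: -t ed25519 picks the key type, -a 200 ups the KDF rounds, -C adds a comment, and -f sets the output file. Getting the public half onto the destination node is then just an append to root’s authorized_keys; a minimal sketch, assuming you paste it by hand via each node’s terminal (the key string below is a placeholder):

# On the source node, print the public key so it can be copied:
cat ~/.ssh/bava_ssh_key.pub

# On the destination node, append the copied line to root's authorized_keys:
echo "ssh-ed25519 AAAA...placeholder... jim_at_reclaimhosting.com" >> /root/.ssh/authorized_keys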

  • Make sure to do everything as the root user on the servers you are moving to and from; there’s a Root Access addon for this in the multi-region environment. Also, keep in mind you only need to rsync to one of the two multi-region environments you created; I chose to copy files and import the database to the bavamulti-1.uk.reclaim.cloud environment.

  • For rsyncing, I added the keys successfully, did everything as root, and still ran into an issue using the following command. Turns out the multi-region WordPress has an application firewall built in that was blocking access via the public IP address, so I needed to use the private LAN IP address instead, which worked!

rsync --dry-run -e "ssh -i ~/.ssh/bava_ssh_key" -avzh /var/www/webroot/ROOT/ root@PUBLIC_IP:/var/www/webroot/ROOT/

That is the version the application firewall blocked; swapping in the private LAN address is what worked:

rsync --dry-run -e "ssh -i ~/.ssh/bava_ssh_key" -avzh /var/www/webroot/ROOT/ root@PRIVATE_LAN_IP:/var/www/webroot/ROOT/

(PUBLIC_IP and PRIVATE_LAN_IP are placeholders for your node’s addresses. Note the trailing slash on the source path so rsync copies the contents of ROOT rather than nesting ROOT inside ROOT, and drop --dry-run when you are ready to copy for real.)

  • After that I needed to access phpMyAdmin on the old site and download the database, and then upload it to the bavamulti-1.uk.reclaim.cloud environment. I tried importing via the phpMyAdmin interface but it timed out; even when I compressed the file it was still taking too long. So I uploaded the SQL file to the new multi-region environment and used the following mysql database import command:

mysql -u username -p new_database < data-dump.sql
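The export side has a command-line counterpart as well: if the phpMyAdmin download were ever to time out too, something like this on the old server produces the same dump file (the username and database name are placeholders):

mysqldump -u username -p old_database > data-dump.sql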

And that worked perfectly: everything was imported and the site was immediately running cleanly in its new home. I checked whether the files and data had been replicated to bavamulti-2.ca.reclaim.cloud (the Canadian-based environment) and I was happy to see that it happened almost instantly. So you only need to import to one environment and all files and data are immediately copied to the other, which is exactly how you want multi-region to work.

The next test was turning off one of the environments to see if the site stays online, and that worked as well. So far it looks like a success. One of the issues I had with the Multi-Region cluster was getting new comments and posts to populate cleanly across all regions if one of the environments was down during the posting, so I will need to test that on the comments of this post, while also making sure one of the servers is down when I publish it.

I decided to re-visit some of my previous work in Cloudflare setting up a load balancer and monitoring the two environments to accomplish a few things:

  • Ensuring that if one of the two servers goes down I am notified;
  • Steering traffic so that visitors closest to the Canadian server get directed there, and visitors in Europe get pushed to the UK server;
  • Testing load balancing to ensure that if one of the two environments goes down, the online server is the only available IP, so there are no errors for incoming traffic.

All of these details are laid out beautifully in this post on the Jelastic blog about load balancing a multi-region setup, so much of what I share below is just my walking through their steps.

In Cloudflare, under Traffic –> Load Balancing for a specific domain, you can create a load balancer that allows you to define pools of servers that are monitored, so that when downtime is detected you not only get an email, but all traffic is redirected from the server with an issue to a server that is online.

Cloudflare load balancing

In the image below you can see that the UK environment is reported as critical, meaning it’s offline. In that scenario all traffic should be pointed to the healthy server in Canada.

View of a server with an issue in Cloudflare load balancer

I can also confirm the email monitoring works:

Example of a monitoring email from Cloudflare notifying me of issues with a server

And I did a test to ensure when both servers are online both IP addresses show up:

dig bavatuesdays.com — notice that when all servers are healthy all IP addresses are listed

And below is a test when one server goes down, in this case the Canadian server: only the UK server IP address shows as available, which means the load balancing and failover are working perfectly:

dig bavatuesdays.com when one of the two servers is down; notice just one IP address shows, the one that works!
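For reference, the test itself is just a dig query; the +short flag trims the output to the bare records:

# Query the A records for the domain; +short prints just the IPs
dig bavatuesdays.com +short
# With both pools healthy, both environment IPs are returned; with one
# origin down, only the healthy server's IP should come back.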

That is awesome. I have to say that Cloudflare is quickly becoming my new favorite tool for playing with all this stuff. And in conjunction with the standalone WordPress Multi-Region, it is a powerful demonstration of how much Cloudflare can do with DNS to help you abstract your server setup and manage failover and multi-region load balancing seamlessly.

The final thing I’m playing with on this setup is Traffic Steering in Cloudflare, which allows me to give each server a rough latitude and longitude so that Cloudflare can calculate how close a visitor is to each IP and steer them towards the closer server. In this way it is essentially geo-locating visitor traffic to the closest server, which is pretty awesome—although I am not sure how to test it just yet.
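One rough way I can imagine testing it: run the same dig query from machines in different regions and see whether a different environment IP comes back first, since the steering decision keys off where the query originates. A sketch, assuming access to a box in each region:

# From a machine in North America: should favor the Canadian environment's IP
dig bavatuesdays.com +short

# From a machine in Europe: should favor the UK environment's IP
dig bavatuesdays.com +short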

So, by all indicators the third time may very well be a charm, and simple is better! But the question remains whether this post will populate across both servers when published with one down, and whether comments will also sync once the failed server comes back online. I’ll report more about this in the comments below….


Understanding Containers through Nextcloud

We are into week 3 of our Reclaim Edtech flex course “Understanding Containers,” and I have to say Taylor Jadin is doing a bang-up job taking folks through the basics. Yesterday was the premiere of week 3’s video, which covers using load balancers, mapping domains, installing SSL certificates, and more. In week 2 we went through installing Nextcloud, and it all started in week 1 with a broader framework for understanding containers as well as getting familiar with Reclaim Cloud. Taylor’s pacing has been excellent, and his weekly videos are accompanied by a weekly blog post with all the necessary resources as well as a to-do list. The way he has set it up has a very ds106 weekly assignment vibe, and I am loving it.* I’m also loving Taylor’s Container Glossary, which provides a nice guide to the basic terminology and concepts undergirding containers.

So, this week I sat down to catch up on my Nextcloud work; what follows are mostly notes for me, but if they come in useful, all the better. I used Taylor’s basic docker-compose.yml file to spin up a very basic Nextcloud instance within a Docker Engine node on Reclaim Cloud.

You would copy this docker-compose.yml file into a directory like ~/nextcloud/, being sure to change the port from 8080:80 to 80:80, and then run the following command:

docker-compose up -d
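For reference, a minimal SQLite-backed docker-compose.yml for Nextcloud (along the lines of Taylor’s file) looks roughly like this, with the port already switched to 80:80:

version: '3'

services:
  app:
    image: nextcloud
    restart: always
    ports:
      - 80:80
    volumes:
      - nextcloud:/var/www/html

volumes:
  nextcloud: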

At that point you’ll have a basic SQLite-driven instance of Nextcloud on the default Reclaim Cloud domain, something like nextcloud.uk.reclaim.cloud. I also wanted to get a separate instance running MariaDB spun up, given that works better for larger setups and syncs with various devices more seamlessly. To do this you can either spin up a new Docker Engine node in a separate environment (which is what I did for testing purposes), or just replace the contents of the existing docker-compose.yml with the directives for creating a Nextcloud instance that uses MariaDB.

If you go the replace route, you first need to completely remove the existing containers from the original instance using the following command:

docker-compose down -v

The -v is important in the above command: down on its own stops and removes the containers, while -v also removes the associated volumes, giving you a completely clean slate. From there I went back into ~/nextcloud and edited the docker-compose.yml file, replacing what was there with these details (be sure to create your own passwords):

Image of docker-compose.yml file for a MariaDB Nextcloud setup

This is the docker-compose.yml file for a MariaDB Nextcloud setup
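If you cannot make out the screenshot, this is roughly what mine looks like; it follows the stock Nextcloud-plus-MariaDB compose example, so treat it as a sketch and swap the changeme passwords for your own:

version: '3'

services:
  db:
    image: mariadb
    restart: always
    command: --transaction-isolation=READ-COMMITTED --log-bin=binlog --binlog-format=ROW
    volumes:
      - db:/var/lib/mysql
    environment:
      - MYSQL_ROOT_PASSWORD=changeme-root
      - MYSQL_PASSWORD=changeme
      - MYSQL_DATABASE=nextcloud
      - MYSQL_USER=nextcloud

  app:
    image: nextcloud
    restart: always
    ports:
      - 80:80
    depends_on:
      - db
    volumes:
      - nextcloud:/var/www/html
    environment:
      - MYSQL_PASSWORD=changeme
      - MYSQL_DATABASE=nextcloud
      - MYSQL_USER=nextcloud
      - MYSQL_HOST=db

volumes:
  db:
  nextcloud: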

Update the docker-compose.yml file with the new details and passwords, being sure, once again, that the port reads 80:80 rather than 8080:80. Save the file and run the following command to spin it up:

docker-compose up -d

After that you should have Nextcloud running on a MariaDB instance. Go to the reclaim.cloud URL and set up the account.

Once you have done that, if you want to map a custom domain you will need to add an Nginx load balancer to the environment, ensuring it has a public IP address. After that, grab the public IP address and use it to point an A record for a custom domain, something like cloud.bavatuesdays.com.

Once that is done you can remove the public IP address from the Nextcloud node (not the Load Balancer). From now on the load balancer provides the public IP, so the IP originally associated with the Nextcloud node is no longer of use and there is no need to pay for it.

There are 3 more things to do: 1) add a Let’s Encrypt certificate using the Load Balancer addon and specifying the mapped domain; 2) redirect to SSL using Nginx, which Taylor blogged and which I sketch just below; and 3) ensure your mapped domain is recognized by Nextcloud by editing the /var/lib/docker/volumes/nextcloud_nextcloud/_data/config/config.php file to include the custom mapped domain, as shown in the config.php screenshot further down.
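On the second item, the redirect is a small server block in the load balancer’s Nginx config; a minimal sketch of what is involved (Taylor’s post has the specifics of where this lives in Reclaim Cloud):

# Catch plain HTTP requests and bounce them to HTTPS
server {
    listen 80;
    server_name cloud.bavatuesdays.com;
    return 301 https://$host$request_uri;
}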

Image of the config.php file that needs to be edited to include the domain name

config.php file that needs to be edited to include the domain name; this was for my SQLite instance
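The edit itself boils down to one array in config.php: the trusted_domains list just needs the mapped domain added alongside the default Reclaim Cloud one, roughly like this (your domains will differ):

'trusted_domains' =>
array (
  0 => 'nextcloud.uk.reclaim.cloud',
  1 => 'cloud.bavatuesdays.com',
),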

Once you make these edits, be sure to restart the respective nodes in your environments. I was able to get both the SQLite and MariaDB instances up and running, and it’s worth noting the MariaDB environment uses 8 cloudlets (roughly $24 per month) versus the SQLite instance’s 4 cloudlets (roughly $12 per month).

Ok, that’s my Nextcloud progress thus far, and I understand there may be some gaps in the notes above, so feel free to ask any clarifying questions in the comments.

______________________

*Most of us at Reclaim are struggling with not sharing these immediately after they are produced given they are currently part of our subscription model for Reclaim Edtech, but whether or not that continues to make sense as a model remains a question.


A MBS Recommendation

Last week I had the privilege of writing a recommendation for Michael Branson Smith to gain admission to the masters program in Interactive Data Visualization at the CUNY Graduate Center. And while I’m confident he will get into the program with or without my recommendation, I used the occasion to try and capture in some detail our collaborations over the last 10+ years, because they represent some of the coolest work I have been involved with. I have said it before and I will say it again: collaborations have been everything since I saw the light with ds106, and they continue to be the magic that makes so many fun, cool things happen. While writing this I realized just how much MBS’s vision has enriched so many of my projects, and how I kept coming back for more. Keep your enemies close, and keep the artists even closer! So, here is a record of my letter of recommendation for MBS, who I have no doubt will continue to do amazing things with or without me or a degree, but why not both anyway? 🙂

___________________________________________________

To the admissions committee:

I think the hardest thing about writing this recommendation will be trying to rein in the endless superlatives that come to mind when discussing Michael Branson Smith’s (MBS) seemingly endless stream of creative genius—there I go already—in the more than ten years I have had the privilege of collaborating with him on a wide range of projects. Given this is a recommendation for a graduate-level course of study, I will try and touch on numerous projects to not only give you a sense of the breadth of MBS’s work, but also reinforce how consistently it has been building towards the project he will be focusing on as a part of your program.

It was during my time as Executive Director of Teaching and Learning Technologies at the University of Mary Washington (UMW) that I first came into contact with MBS. It was the Spring of 2011, and the open, online course Digital Storytelling 106, often referred to as ds106 (http://ds106.us), was being taught at UMW as a possible vision of a forward-thinking model for online learning premised on networked communities of practice and learning. As a founding faculty member of the Communication Technologies program at York College, MBS integrated his introductory-level courses into this distributed network of digital storytelling as a means to rethink the ways in which students grappled with the emerging realities of digital literacy and identity, placing him at the forefront of an avant-garde of educational technologists and educators attempting to grapple with the impact these emergent web forms were having on teaching and learning.

His work in ds106 led to a string of generative projects wherein he was learning and building right alongside his students—arguably the most honorable of pedagogical traits. His art training in many ways defined the aesthetic of ds106 when it came to the emergent reclamation of the GIF as a web artform. I very much consider MBS a GIF artist, amongst other things, and his Hitchcock Animated Movie Posters speak to that more precisely than I ever could (https://www.michaelbransonsmith.net/blog/alfred-hitchcock-film-posters-animated/). But beyond his early vision for the GIF as art, he also understood the form as integral to a deeper appreciation and study of film as a medium. He saw the GIF as a tool that could be used by faculty and students alike to capture and linger on certain moments of film that can be read more closely as a result of the technology, and in many ways this attribute might be what is most remarkable to me about MBS: namely, his ability to simultaneously fuse his art, pedagogy, and research.

Image of North by Northwest movie poster animated

MBS’s North by Northwest animated movie poster

In the Spring of 2015 our creative collaborations continued, but this time moving from the future of teaching and learning with virtual communities like ds106 to the past, in the form of a 1980s Console Living Room exhibit (http://consolelivingroom.net) at UMW. This reproduction of a 1980s living room, replete with period-specific furniture and technology, was placed within the context of a newly opened, state-of-the-art Technology Convergence Center as a way to connect the technology of the early 21st century with that of a particular moment in the rise of game consoles, video home entertainment systems, and network television. MBS’s contribution to the exhibit was both the vision and the expertise to program an entire day’s worth of network television (importantly including period-specific commercials) that would then, using a combination of Raspberry Pi savvy and short-range television frequency know-how, broadcast over-the-air to numerous televisions in the exhibit space simultaneously.

It was the finishing touch that truly brought the entire exhibit to life, underscoring the dominance of network television in the living room of the mid-1980s as well as reminding us of the power of the analog over-the-air broadcast spectrum that has all but been converted to digital. MBS wrote extensively about his development of this project on his blog, and it is well worth a read for a deeper look into both his creative and technical process: https://www.michaelbransonsmith.net/blog/an-mbs-special-presentation/

Our next collaboration, in 2018, on the Reclaim Video (http://reclaimvideo.com) site was interesting because it captures MBS’s transition towards deeply sharpening his programming skills in HTML, CSS, and JavaScript (HTML/CSS/JS), which was in line with his shifting focus from teaching ds106 to building the core coding competencies for the Communication Technologies program at York College. This project was in many ways an extension of the Console Living Room, but this time a re-creation of a 1980s video rental store called Reclaim Video. We enlisted MBS to design the site, and his work at this time foreshadows the time-based scheduling work he would do with the YouTube API for the OERxDomains21 conference site, but more on that soon.

Image of Reclaim Video Site

MBS’s Reclaim Video site design

I think this example is important not only because it highlights his ridiculously gorgeous animated movie posters as well as his brilliant sense of design, but also because it marks a moment wherein he starts to dig deeper into the coding, which would become a multi-year odyssey to not only make the academic program at York College that much better, but also prepare for the work that would ultimately lead him to apply to the Grad Center. In fact, I wrote extensively about this project in a post titled “A MBS Special Presentation at Reclaim Video” (https://bavatuesdays.com/a-mbs-special-presentation-at-reclaim-video/), and it might be of note here given it is also the first time I heard MBS discuss his broader vision for the subtitle/dialogue analysis project he hopes to work on as part of the Interactive Data Visualization program.

MBS’s deep dive into HTML/CSS/JS would result in an interesting return to his animated movie posters from many years earlier, with a remarkable twist: shifting the animation from the format of the GIF image to the underlying code of the website, namely CSS. It’s both a literal and metaphoric transition of MBS’s art from primarily that of the image to that of the underlying code that makes the image possible. It is a journey to the center of an aesthetic, and I have to say his code-based animated movie posters are some of the most brilliant examples of CSS-based popular media art I have seen on the web. Take a look for yourself at his strictly CSS-based animated rendition of the Dr. Strangelove poster and tell me I’m wrong: https://mbs.nyc/posters/dr-strangelove/. And if you want to hear more about the process, we recorded a video in 2020 wherein he discusses the detailed process of creating these posters purely from code: https://today.reclaimhosting.com/podcast/026-anatomy-of-css-animated-movie-posters/

Again, I point this out to highlight the amount of focused, self-taught design and programming skill that he has picked up during the more than 10 years I have had the privilege of working with him.

Image of the OERxDomains21 Schedule

MBS’s TV Guide-inspired design for the OERxDomains21 Schedule

And while all of these projects speak volumes about MBS’s qualifications for your program, I do think his work on the OERxDomains21 conference (https://oerxdomains21.org/day1.html, https://oerxdomains21.org/day2.html) highlights a unique synthesis of his design and coding expertise in one innovative, elegant, and highly functional site—not to mention all accomplished under the duress of a ridiculously short deadline. The OERxDomains21 site was essentially the conference platform, using a TV Guide-inspired layout to drive a time-based program for integrating and playing both live and pre-recorded YouTube videos. While all the data is entered in a WordPress site (session details, times, tracks, participants, video IDs, etc.), that information is pulled into an HTML/CSS/JS frontend separate from WordPress (referred to as headless WordPress) so that the site could take advantage of a more sophisticated design, one that worked with the YouTube API to ensure three separate channels of videos played across the platform simultaneously while everyone stayed synced at the same moment to keep a consistent sense of “live” presence. It is effectively an attempt to realize the physical work of the Console Living Room project virtually, for a conference site built around the idea of network television from another age. MBS describes his process of wrangling the YouTube API, as well as wrapping his head around the challenges of temporality and timezones, in this post (https://www.michaelbransonsmith.net/blog/2021/04/29/this-is-temporal-experiment-number-one/). To be honest it’s all quite complex to me, but what I can attest to as one of the conference organizers is that the experience was truly transformative as a means of delivering a simultaneously live and asynchronous event across multiple channels over the course of two days. So much so that both Reclaim Hosting, my humble organization, and the Association for Learning Technology, the professional organization in the United Kingdom, have continued to use this platform to deliver events since.

Last, but not least, I was thrilled to learn that MBS’s idea for building a tool to visualize word diversity in top-grossing films found an outlet thanks to his participation in an Interactive Data Visualization course at the Graduate Center this Spring. It’s a project many years in the conceiving, and the fact that MBS is diving further into JavaScript libraries and learning how to program in Python points to the continued maturation and sharpening of an already remarkable set of skills. I’m sure your familiarity with this most recent work will be grounds enough for admission, given his detailed and eminently achievable plans to further fine-tune the work, facet search by speaker, and leverage existing tools to build a queryable API for searching dialogue, all of which would represent an invaluable resource for visual media researchers.

Image of Investigating Dialogue in Top Ten American Films site

MBS’s Investigating Dialogue in Top Ten American Films tool

But for me the real reason to accept MBS into your program is his ongoing demonstration of marrying a strong will to learn complex technical frameworks with the ability to build creative and compelling experiences in service of a broader vision: exploring the power of popular media to shape how we understand the particular “channel” of the world we are watching. I cannot recommend MBS for your program strongly enough; he is not only a talented artist, designer, and programmer, but also an absolute joy to collaborate with in every sense of that word. The Interactive Data Visualization program will be richer in every way as a result of his acceptance and matriculation.


Vinylcast #53: Built to Spill’s Perfect from Now On

I was going through my vinyl the other day, and after listening to Built to Spill‘s 1997 masterpiece Perfect from Now On I knew I wanted to make it my next #vinylcast. I have both the 1997 version as well as the 2007 re-release, and this was the re-release, given the original vinyl I have is in rough shape from over-playing it in the late 90s. The difference is that the fourth side of the 2007 two-disc version has the b-side song “Easy Way,” which was not originally released with the album in 1997. I didn’t have time to play it during this broadcast, and I liked the idea of keeping it to the original anyway; I’ll try and get an addendum recording up at some point.
Audio of Vinylcast #53: Built to Spill’s Perfect from Now On
Anyway, like with Yo La Tengo’s And Then Nothing Turned Itself Inside-Out, I was able to cross-cast between ds106radio and Reclaim Radio while also streaming the spinning record live to bava.tv thanks to the #vinylcam. I do like having my own combination TV/radio station so very much. I’m at the bava.tv, what? I’m at the bavaradio, what? I am at the combination bavatv/radio!

bava.tv #vinylcam view of Built to Spill’s Perfect from Now On


Karaoke Czar

I have been marrying my recent experiments with Peertube live streaming to karaoke; two great tastes that taste great together 🙂

https://bava.tv/w/2aX4PEbuT59VHJL1dq2PT3


The Blogsphere is Hot….with Edtech Angst

The above video has been the source of an ongoing joke at Reclaim Hosting for many years now. I’ve been carrying the blogging torch, and this has been my reference point for a time when blogging was so popular and “hot” that it was actually the butt of elaborate video jokes. Well, nothing gold can stay, or can it? I’d hate to jinx it, but the blogsphere truly is hot these days; unfortunately the spark seems to have been Audrey saying goodbye to edtech —everything comes at a cost. But hey, maybe blogs, like vinyl, are back?

https://twitter.com/KateMfD/status/1539797508645601281

That might be wishful thinking, but this morning I’m still making my way through the seemingly endless examples of amazing edtech at TRU thanks to Brian Lamb’s opus “A Trailing-Edge Technologies Share-a-thon,” a post that goes a long way to remind me why I fell in love with this field to begin with. But that is just the most recent post; there is also Anne-Marie Scott’s recent blog chorus “from little things big things grow,” which should be the tagline of the “Summer of the Blog”—I’m a big fan! And then Alan did what Alan does and mashed up 70s horror film posters with the recent spate of edtech corporate shenanigans while asking “Who are we?” Are we not edtechs? WE ARE TRAILING EDGE LION PUNCHERS*! I really appreciate Alan’s lingering a bit longer on the news cycle of folks turning their back on their roots for what can only be understood as profit. And that’s just a few posts; there are many, and something strange happened this week at work: folks at Reclaim Hosting are linking to these posts and talking about them. The idea, as Martin Weller noted in his “Review of the ed tech angst,” of an edtech community of practitioners felt real, and I felt excited that maybe my ongoing “the blogsphere is hot” joke might be grounded in some reality, even if only for a moment.


What’s more, Martin’s post inspired something in me I had not felt in a long time: the desire to discourse. The idea that someone said something in a blog post you want to respond to, but not in 140 characters and not with a like or heart, but right here on the bava. Whether or not this is a good thing may still be a question 🙂

Anyway, one of the things Martin noted about a couple of posts I wrote was that I might be coming off “like grumpy old music fans who preferred a band before they sold out.” I understand where he is coming from: there might seem to be a bit of snobbery in me suggesting that folks traded cachet for cash, but at the same time it’s what happened. I mean, Lumen did offload OER courses to Course Hero in what we can only assume was a deal with a questionable company. I don’t think I am exaggerating here, and the schism in the open community that has been happening for years is no longer being whispered about at parties; it’s pretty apparent for all to see. I have never been a fan of the OER movement; it has monopolized most of the grant money and oxygen in the broader field of edtech using a series of what appear, in retrospect, to be cynical frames around affordability and access. But I also struggle with how narrowly OER is defined as open-licensed textbooks; Downes did a great job several years ago pushing back against that frame, and his is a future of open resources that is far more compelling and relevant.

The bit that stuck in my craw from Martin’s post was the idea that “a lot of new ed tech people are driven by values, such as social justice, rather than an interest in the tech itself.” I don’t discount this, and speaking just anecdotally I can see it in the next generation of folks working at Reclaim Hosting. That said, this idea that understanding the tech and remaining interested in what it affords is somehow different from being a critical participant paints a myopic picture wherein the previous generation of edtechs were simply navel-gazing around the latest tools. I’m not sure that’s the case; in fact, Brian Lamb’s linking to our “Never Mind the EDUPUNKs” article in his latest post reminded me that understanding and being familiar with the potential of the tech and how these infrastructures work was a source of critical power. And the seemingly false dichotomy between engaging the new tech and being able to remain critical seems to suggest our job as edtechs is not about the tech, which seems odd to me. Now I may be reading too deeply into this, or even carting my own baggage here, but I feel like my job as an edtech is to understand the larger shifts, technically and culturally, so that we can work with faculty and students to provide options that enable empowerment. The risk of critical edtech devolving into a malaise of critique without alternatives has never been greater. In fact, it seems to me the cynicism in our field is not limited to OER, given the leaders of the digital pedagogy movement centered on social justice have also re-framed their mission as one of token critical voice inside the corporate machine. Good edtech is like good reading: you have to engage the technical and pedagogical work, try and understand it deeply, and then critique it as part of the larger landscape while being honest about its affordances at the same time. It is a balancing act that can too easily devolve into “it’s not about the tech…”

All that said, I understand righteousness can come at a great cost, but I find you only have to pay when you actually sell out 🙂 Long live the bava.blog!

____________________________________________

*Is that a blog comment conversation I spy? I am sure Tom Woodward doesn’t know what to do with himself when anyone but Alan leaves a comment 🙂


Installing Manifold on Reclaim Cloud

Have I mentioned recently that it’s container month at Reclaim Edtech? Well, it is, and that means I’m playing with installing apps, or even re-visiting apps we’ve already gotten running, which is the case here. Tim Owens already documented the process of getting the open source publishing tool Manifold up and running in Reclaim Cloud. I know this application is of interest to some of us at Reclaim, so I wanted to give it a go this morning. I documented the process in a how-to video because, you guessed it, it’s container month!


Reclaim Radio 2.0

Just over two years ago I wrote about spinning up a work experiment called Reclaim Radio using Azuracast. We used it irregularly and eventually it died on the vine a bit, especially given I was so used to broadcasting through ds106radio by default. But recently Lauren Hanks has been exploring ways of keeping our fully remote team connected in some fun ways, and there was mention of a weekly radio day where folks would create the soundtrack for others on that day. There are probably a million ways to do this, but given it is container month at Reclaim Edtech (have I mentioned that recently?), I decided to jump back into Azuracast and get Reclaim Radio back up and running.

A lot has changed in Azuracast in the last two years given it’s vigorously developed and supported, so I made the executive decision to delete the previous instance and start from scratch. This was possible because there were next to no files uploaded to the server. We do have a one-click installer on Reclaim Cloud for Azuracast—and that is ultimately what I ended up using*—but you could also install within Docker Engine using the guide here or explore the custom image on Docker Hub, which I haven’t had luck with.

Anyway, it’s been a while since I started from scratch, and one of the changes is that you can now get an SSL certificate from within the web interface, which is nice. The other thing that threw me off was that by default AutoDJ won’t spin up unless it has something to play…you need to give it a single default track to loop through, and that will satisfy the requirement and make sure everything spins up. This had me stumped for a bit, but a trip over to the Azuracast Discord for help solved this one, grazie Buster!

Image of Reclaim Radio homepage

Reclaim Radio 2.0

So, after that it was smooth sailing and I had Reclaim Radio up, with the custom listen page working cleanly given the domain remained the same. The crazy piece there is that I stole the homepage for our listen/player from Taylor Jadin two years ago, and two years later he’s not only working for Reclaim, but running the Container workshop this month #4life. So, after getting it running I needed to test simulcasting to two stations using Audio Hijack, and that worked quite well:

Image of Audio Hijack for Reclaim Radio ds106radio simulcast

Audio Hijack for Reclaim Radio ds106radio simulcast

If anyone is interested I can break down what you see above in more detail, but quite simply I am running my microphone, Google Chrome, and my turntable (USB Audio Codec) through two broadcast blocks, namely Reclaim Radio and ds106radio, before the audio is recorded, and I am monitoring everything through the audio jack in my Elgato Wave microphone. Getting this working was rewarding, and soon after I did a late-night broadcast wherein I looped in a third stream, namely a #vinylcam cast through bava.tv, so you could both see and hear the record spinning 🙂

That was fun, and the following day in the water cooler Slack channel for Reclaim, Pilot Irwin shared some music they are listening to, and that got me super excited about Reclaim Radio. So during the edtech meeting that morning we played with the WebDJ feature of Azuracast, something I had not had luck with previously, but this time it worked.

Image of Azuracast's WebDJ interface

Azuracast’s WebDJ interface

The great Scottlo was/is a fan of the WebDJ given it eliminates a ton of overhead to getting online and broadcasting with apps like Ladiocast, Audio Hijack, Mixxx, etc. All you need are credentials and a mic, and you can start uploading songs to playlists and mixing music in with your own audio. And thinking through this yesterday in the edtech meeting: with a tool like Loopback, which virtualizes a microphone so you can mix together several applications, you could then use that virtual mic in WebDJ to broadcast Skype calls, Zoom meetings, and just about any other app on your computer you can think of. I am pretty excited about this, and hopefully it will become a regular thing at Reclaim. But if nothing else, it’s always valuable for me to spend time getting comfortable with the ins and outs of Azuracast given ds106radio isn’t going anywhere anytime soon.

________________________

*It throws an error during installation that might make you think it did not install correctly, but it did. We do have to fix that.
