More Notes on ds106 Clean-Up

I think the worst of the ds106 hack is behind us, and thanks to Wordfence handling most of the heavy lifting it was not all that bad in the end. Now I’m just going through and pretty ruthlessly removing outdated plugins and themes, removing users, and seeing where old PHP code that never made the jump from 7.x to 8.x can be cleaned up or removed. I enlisted Tom Woodward for the PHP clean-up in a few themes and plugins, and I also paid for an updated version of the Toolset Types plugin, which drives the Assignment Bank; it was the easiest short-term solution until we can figure out a way to replicate that functionality without a premium plugin. As mentioned in my last post on the hack, I also bit the bullet and purchased the latest version of the premium theme Salutation so that it, too, was PHP 8 compatible.

At this point I think we’re in pretty good shape. The only issue since the clean-up has been a result of my zeal for removing plugins and themes. I deleted the 960bc theme Tom had already cleaned up (oops!) as well as the Daily Blank theme that drives the Daily Create. I restored a version of that theme that still had some leftover cruft, so I finally did what I should have done in the first place: download a clean copy of the Daily Blank from GitHub.

My idea is to try the MU Migration plugin, which I’ve heard great things about from Charles Fulton, to pull the Daily Create out of the ds106.us multisite into its own instance. That’s currently the most active of the sites, and exploding some of these sites out of the multisite at this point will make it easier for me to work through a full-blown flat-file archive/migration of the main ds106 syndication site, which has almost 100,000 posts. So the upside of all this is that it provides a great excuse for a bit of experimentation with a new migration tool, as well as some more archiving in the vein of UMW Blogs.
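For reference, MU Migration is a WP-CLI plugin, so the export/import dance should look roughly like the sketch below. The blog ID and file names are hypothetical placeholders, and the flags are as I recall from the plugin’s docs, so it’s worth confirming the exact invocation with `wp help mu-migration` before running anything for real:

```shell
# On the multisite: find the blog ID of the sub-site to extract.
wp site list --fields=blog_id,url

# Export that sub-site's tables, plugins, themes, and uploads into one
# zip bundle (blog ID 5 is a placeholder -- use the real one from above).
wp mu-migration export all daily-create.zip --blog_id=5 --plugins --themes --uploads

# On the fresh single-site install: import the bundle.
wp mu-migration import all daily-create.zip
```

The appeal of doing it this way is that the plugin handles the fiddly parts of a multisite extraction (table prefixes, user mapping) that make a manual export so error-prone.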

Speaking of experimentation, the whole reason I was playing with ds106.us that ultimately led to the discovery of the hack was to test out the WP Offload Media plugin I am running on this blog. It’s up and running on ds106.us as well, and serving media for the main ds106.us site as well as the Assignment Bank without any issues. The other test was to see if it works cleanly with a mapped domain, and this led to getting the Camp Magic MacGuffin domain re-mapped. Mapping a domain on a WordPress multisite running in Reclaim Cloud (where ds106 lives) is pretty easy; we even have documentation for it. The one thing I had to do was add the following line to the wp-config.php file, given I was having cookie issues when logging into the admin of https://magicmacguffin.info/:

define('COOKIE_DOMAIN', false);

Adding the above line to wp-config.php solved the login issues, and Camp Magic MacGuffin is once again mapped, so I can now further test how media offloaded to S3 works on a mapped domain.

While all the media in the media library has been offloaded to S3, I have not removed the local media, given I’m not a very trusting person when it comes to losing files. That said, I did notice that existing files are still loading over the subdomain macguffin.ds106.us rather than being pushed to the domain alias on S3: files.ds106.us. I could do a rewrite in the database to ensure they load over files.ds106.us, but it’s odd given the assignments.ds106.us subdomain pushes all images to files.ds106.us automatically. This is something I’m going to have to test more, but it is good to know that all the media files are in S3, and I can find and replace if I have to. But I do want to understand why they don’t automatically switch on this mapped domain site.
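If I do end up forcing the rewrite, WP-CLI’s search-replace command is the usual tool for this kind of URL swap on a multisite, and a dry run first shows what would change without touching the database. The hostnames below are the ones from this site, but treat the exact invocation as a sketch rather than the final command:

```shell
# Preview how many rows on the mapped site reference the old subdomain;
# --dry-run reports the changes without writing anything.
wp search-replace 'macguffin.ds106.us' 'files.ds106.us' --url=magicmacguffin.info --dry-run

# If the preview looks sane, take a database backup and run it for real.
wp db export backup-before-rewrite.sql
wp search-replace 'macguffin.ds106.us' 'files.ds106.us' --url=magicmacguffin.info
```

The `--url` flag scopes the replacement to the one mapped site rather than every table in the multisite, which is exactly what you want when only one sub-site is misbehaving.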

So, I guess we are far enough through the hack that I can get back to some experimentation that will, eventually, lead to another hack 🙂

