PMRacing
PMRacing GRM+ Member and UltraDork
12/17/22 6:53 p.m.

I have two websites from my past racing adventures: TeamMatherMotorsport.com and PhilMatherRacing.com. I really don't use them anymore, but I don't want to let the domains go in case I want to use one of them for business in the future. 

I can't find a way on my domain service (Yahoo Business, now Turbify) to just keep the domains. I also don't want to lose the content. Is there a way to download the pages? I think I still have everything I used to create them in the first place.

Thanks!

aircooled
aircooled MegaDork
12/17/22 7:58 p.m.

Those look to be pretty simple sites.  I don't know what you use to update them, but you should be able to simply copy the HTML files into a folder.  You can even drag each page out of the browser (click just to the left of the URL and drag it into a folder).
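
If you'd rather script it than drag pages around, a one-liner like this should save a single page (untested against your sites, and it only grabs the HTML itself, not the images or stylesheets):

    curl -o index.html http://www.teammathermotorsport.com/

Repeat with a different output filename for each page you care about.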

You can also always view them in the Internet Archive, and even go back in history (click a year, find a date that was captured, and click on that).  The archive sometimes doesn't capture everything, but it might with sites this simple.

https://web.archive.org/web/20220815000000*/www.TeamMatherMotorsport.com

https://web.archive.org/web/20220401000000*/www.philmatherracing.com
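
If you'd rather not click around, the Wayback Machine also has an availability API that returns the closest snapshot as JSON. Something like this (the parameter name is from memory, so double-check it):

    curl "https://archive.org/wayback/available?url=teammathermotorsport.com"

It only tells you whether a snapshot exists and where; you'd still grab the actual pages from the archive links above.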

BoxheadTim
BoxheadTim GRM+ Member and MegaDork
12/17/22 8:39 p.m.

Most domain registrars should allow you to "park" the domain without using it. I personally use Porkbun and Hover, and have "parked" (unused) domains on both.

Is there any specific reason not to simply leave the sites online, maybe on a cheaper hosting plan? That can help for multiple reasons: for starters you avoid the hassle of parking the domains, plus you can show that you've been using them the whole time in case someone decides to come up with some sort of daft copyright claim.

Pete. (l33t FS)
Pete. (l33t FS) GRM+ Member and MegaDork
12/19/22 9:41 p.m.

Years back I used a bit of software called wget that, in true UN*X style, was a simple command-line program to Website GET.  It would download the main page, all content on that page, and all linked pages (and their content) up to a defined recursion depth, or branched levels, or whatever the correct term is.
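
From memory, grabbing a whole site looked something like this (check the current wget docs, the exact flags may have shifted over the years):

    wget --recursive --level=2 --page-requisites --convert-links --no-parent http://www.philmatherracing.com/

The --level option is that recursion limit I was talking about (how many links deep it will follow), and --convert-links rewrites the links so the saved copy works offline.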

 

My Internet connection back then was really flaky, so I used it to download a couple of sites in their entirety so I could peruse them at my leisure.

 

BTW - My computer was running XP, so this was the Windows version, but I liked the brute simplicity of command-line functionality.

 

BTW2 - I really miss that computer.  It was a Fujitsu laptop with a Cintiq screen; you could rotate the screen 180 degrees and use it as a drawing tablet!

slefain
slefain UltimaDork
12/20/22 9:49 a.m.

Way back when, I used a program called "Site Sucker" to pull down copies of competitors' entire websites to analyze at my leisure. It might still be available.

J.A. Ackley
J.A. Ackley Senior Editor
12/20/22 10:18 a.m.

Since the sites don't appear to be MySQL-driven, if you have FTP access you should be able to just download everything that way.
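
If you don't have an FTP client handy, wget can mirror an FTP directory too. Something like this, with your own login and whatever hostname Turbify gives you (the host and path here are just placeholders):

    wget -m ftp://USERNAME:PASSWORD@ftp.example.com/

That pulls down the whole directory tree, which for a static site is the whole site.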
