By aaron.axvig, Tue, 06/16/2020 - 11:13

This went OK.  I had some tangles with incompatible modules, almost surely my fault for doing things wrong. Fixing it was a good way to become more familiar with composer and the Drupal module system!

To get Drupal upgraded I edited the composer.json file manually to remove the Matomo module, which was a bad idea.  I suppose its versioning info was signifying that it was not compatible with Drupal 9, so once composer no longer knew that I wanted the Matomo module it was happy to upgrade to Drupal 9.  But Drupal's internal config still knew that I had the Matomo module installed.  Eventually I figured out how to uninstall the Matomo module with drush, but only after figuring out a workaround for a bug with the module uninstaller.  BTW, the Matomo code was no longer in the modules/contrib folder on my server, and composer refused to install it since it was incompatible with the already-installed Drupal 9, so I had to manually download the Matomo module code into that folder in order to be able to run the drush command to uninstall the module.
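Reconstructing from memory, the recovery sequence looked roughly like this.  The package name and the core-upgrade line are my best guesses rather than copied from my shell history, so treat them as a sketch:

```shell
# 1. Drop the module properly (editing composer.json by hand was the
#    bad idea; `composer remove` keeps composer.lock consistent too).
composer remove drupal/matomo

# 2. With nothing pinning an incompatible module, upgrade core.
#    (Exact constraint is an assumption.)
composer require drupal/core-recommended:^9 --update-with-dependencies

# 3. Put the module code back into modules/contrib/matomo by hand so
#    drush can run its uninstall hooks -- composer refuses, since the
#    release declares itself incompatible with Drupal 9.

# 4. Remove the module from Drupal's own config.
drush pm:uninstall matomo
```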

Then I found that I had similar problems with two other modules: popular_tags and tagclouds.  Fortunately I WAS able to add and remove those at will using composer while I was learning that I needed to uninstall them using drush, and they did not have any uninstall errors.  I actually wrote this up backwards: I think I figured out the Matomo module after solving the issue with these other two.

It would have been a lot easier to disable/uninstall those modules in the Drupal admin pages before upgrading!


By aaron.axvig, Tue, 03/31/2020 - 16:47

I was recently helping someone with a transition from Dropbox to SharePoint/Teams/OneDrive for Business.  They were running into issues with filenames.  As an all-Mac business, they had created many files and folders with colons in the names.  Windows doesn't allow those characters, and SharePoint does not appreciate them either.

Some Googling suggested that downloading a ZIP of the Dropbox folder might solve the problem.  I found that when I extracted the ZIP the offending files were just missing.

I ended up creating a droplet on Digital Ocean and syncing the Dropbox folder to it.  Even though it was 20GB and 14,000 files, the sync only took three or four minutes!

Then I set about carefully renaming things.  Thanks to StackOverflow I mainly worked with variations of this command: find . -type f -name "*:*" -exec rename -n 's/:/-/g' {} +  It renames all files whose names contain a colon, replacing each colon with a dash.  If you have directories that contain a colon it will fail to rename those; run it again with -type d for directories.  Remove the -n to actually make the changes; with -n it just tells you what it would do.  Append | wc -l on the end if you want to count how many issues you have.
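If you don't have the Perl rename utility on hand, the same colon-to-dash rename can be done with just find and mv.  This is a sketch; the demo/ tree here is made up for illustration:

```shell
#!/bin/sh
# Demo tree with one offending file (colons are legal on Linux).
mkdir -p demo/sub
touch "demo/sub/report:2020.txt" "demo/plain.txt"

# -depth processes leaves first, which matters once you run the same
# loop for directories containing colons.
find demo -depth -type f -name "*:*" | while IFS= read -r f; do
    dir=$(dirname "$f")
    base=$(basename "$f")
    mv "$f" "$dir/$(printf '%s' "$base" | tr ':' '-')"
done
```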

Towards the end I was just using find . -name "*:*" to do final checks, and I checked for all of the invalid characters.  Some of them require \ as an escape character; for example, find . -name "*\?*"
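You can also check for all of them in one pass with a bracket expression, since inside brackets ? and * are literal and nothing needs a backslash.  A sketch (the character list is the set SharePoint documents as invalid, and the demo files are made up):

```shell
#!/bin/sh
# Demo files: two bad names, one clean one.
mkdir -p demo
touch "demo/a:b.txt" "demo/c?d.txt" "demo/plain.txt"

# Flag any name containing < > : " | ? or * in a single find run.
# (/ and \ can't appear in Linux filenames anyway.)
find demo -name '*[<>:"|?*]*'
```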

Then you may find out other odd things.  For example, file and folder names cannot start with a space.  find . -type f -name " *" -exec rename -n 's/\/ /\//' {} + can help with that.
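An alternative sketch for the leading-space case, using parameter expansion on the basename instead of a regex over the whole path, which avoids accidentally matching a "/ " that belongs to a parent directory (demo/ is again made up):

```shell
#!/bin/sh
# Demo file whose basename starts with a space.
mkdir -p demo
touch "demo/ draft.txt"

# Strip one leading space from each offending file's basename.
find demo -depth -type f -name " *" | while IFS= read -r f; do
    dir=$(dirname "$f")
    base=$(basename "$f")
    mv "$f" "$dir/${base# }"
done
```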

For the actual sync I set up Dropbox and OneDrive for Business on the same machine (with long file names enabled).  Then I used robocopy /MIR to copy files into the OneDrive folder.  After my first run of that, OneDrive started syncing as expected.  Then it got upset about some filenames, which is when I realized that there are additional characters not allowed in SharePoint that are OK in Windows.  It offered to fix those, replacing the characters with an underscore.  Then I found out about the spaces issue and fixed that back on the Linux droplet.  After Dropbox synced I ran robocopy /MIR again to put in those new fixes.  A significant problem with this method is that OneDrive for Business changes the file size of some Microsoft file types, so robocopy /MIR sees those files as changed and copies them again; if you have a large number of those files it might cause difficulties.  After the initial robocopy, the impact can be greatly reduced by using the /MAXAGE parameter to limit the copy to files changed in the last few days.
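The copy step, roughly.  The paths are placeholders, not my actual ones; /MIR (mirror the tree) and /MAXAGE:n (skip files older than n days) are both standard robocopy switches:

```batch
:: First pass: mirror everything from Dropbox into the OneDrive folder.
robocopy "C:\Users\me\Dropbox" "C:\Users\me\OneDrive - Contoso" /MIR

:: Later passes: only pick up files changed in the last few days, which
:: sidesteps recopying Office files whose sizes OneDrive has altered.
robocopy "C:\Users\me\Dropbox" "C:\Users\me\OneDrive - Contoso" /MIR /MAXAGE:3
```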