We want to let you know about three upcoming changes in MODX Cloud, which should roll out starting in October, and be completed by the end of 2017.
Deprecating PHP 5.6.x
First, when you create new Clouds, PHP will default to the latest stable version moving forward, currently 7.1.x. This version brings noticeable improvements in platform reliability and performance. (You can still choose to use an earlier version.)
Along with this, we’re formally tying PHP deprecation in MODX Cloud to the official PHP release cycle. This means that when PHP 5.6 is no longer maintained (after December 31, 2018), it will be removed from MODX Cloud; any sites still running PHP 5.6 will then be moved to the earliest maintained PHP version available in MODX Cloud (currently PHP 7.1).
We strongly suggest testing any custom code against PHP 7.1 now. Most sites running current versions of MODX Revolution, WordPress, and other PHP software should work without issue.
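As a rough illustration (the log messages below are just one possible approach, not a MODX Cloud requirement), a small pre-flight check like this can flag the most common incompatibility in older custom code: reliance on the mysql_* extension, which was removed in PHP 7.0.
<?php
// Rough pre-flight check to run before switching a site to PHP 7.1.
// PHP_VERSION_ID and function_exists() are core PHP; how you report the
// result (error_log here) is up to you.
if (PHP_VERSION_ID < 70100) {
    error_log('Still running PHP ' . PHP_VERSION . '; test custom code against 7.1 before switching.');
}

// The legacy mysql_* extension was removed in PHP 7.0, so custom snippets or
// plugins still calling mysql_connect(), mysql_query(), etc. must be ported
// to mysqli or PDO before the switch.
if (!function_exists('mysql_connect')) {
    error_log('mysql_* functions are unavailable; port any code using them to mysqli or PDO.');
}
A staging copy of your site running PHP 7.1 remains the most reliable test.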
To see which version of PHP your site is running, log in to the MODX Cloud Dashboard, click on a MODX Cloud instance, and visit the Web Server tab. The first option there is a toggle for “Use PHP 7.1”; click it to enjoy a faster website that can handle a lot more traffic.
New InnoDB Support and Transition
In addition to defaulting to modern versions of PHP moving forward, we will also transition all new Clouds from the current MyISAM default to the InnoDB storage engine.
InnoDB is more robust and better suited to applications like MODX Revolution. You should see improvements in platform I/O, meaning your sites should perform better and faster, and survive hiccups better, too. MODX Revolution 2.6 will default to InnoDB for new installations in MODX Cloud and on other servers that support it, and InnoDB has already worked well in MODX Cloud for years.
Further, we plan to transition all databases to InnoDB in the future to bring additional performance and resilience to MODX Cloud.
You can get a head start by switching to InnoDB today, following this tutorial on how to convert a database from MyISAM to InnoDB using phpMyAdmin.
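If you prefer working outside phpMyAdmin, the sketch below shows the same conversion with a short PHP script; the DSN, username, and password are placeholders, and you should back up the database before running anything like it.
<?php
// Minimal sketch: switch every remaining MyISAM table in one database to InnoDB.
// Replace the placeholder DSN and credentials with your own before running.
$pdo = new PDO('mysql:host=localhost;dbname=your_database', 'your_user', 'your_password');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// List the tables in the current database that still use the MyISAM engine.
$tables = $pdo->query(
    "SELECT TABLE_NAME FROM information_schema.TABLES
     WHERE TABLE_SCHEMA = DATABASE() AND ENGINE = 'MyISAM'"
)->fetchAll(PDO::FETCH_COLUMN);

foreach ($tables as $table) {
    // ALTER TABLE ... ENGINE=InnoDB rebuilds the table using InnoDB.
    $pdo->exec("ALTER TABLE `{$table}` ENGINE=InnoDB");
    echo "Converted {$table} to InnoDB\n";
}
Large tables can take a while to rebuild, so plan the conversion for a low-traffic window.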
Saying Goodbye to Robots.txt (in the Cloud Dashboard)
In October, we will remove the MODX Cloud Dashboard toggle that enabled or disabled search engine indexing (via a default robots.txt file) for sites. If you don't want search engines to index your site, you will need to upload a file named “robots.txt” with the following content into the web root of your instance(s):
User-agent: *
Disallow: /
In addition, any site running on an internal domain ending in “.modxcloud.com” will serve a robots.txt file with the directives above, so internal URLs will not be indexed by the major search engines. This will help prevent the oversights that happened when development or staging sites running on internal domains were accidentally indexed by search engines.
Most sites that respond to multiple URLs, like alternate/old domains or .com/.net/.org versions, are easily accommodated with an NGINX web rule that redirects visitors to your desired URL, keeping Google happy and preventing duplicate-content penalties in search engines:
if ($host != "www.my-domain.com") {
    return 301 https://www.my-domain.com$request_uri;
}
If you run multi-context sites that mix private sites and publicly indexed sites in a single installation, please reach out to us for help creating NGINX web rules, or review this answer at ServerFault on how to do it.
The new robots.txt behavior works exactly like it does on any other web server today: if a robots.txt file exists in your web root, search engines will crawl your site according to its directives. If there isn’t one, they’ll follow every link on your site and index it accordingly.
To learn more about robots.txt and controlling search engine behavior, visit https://www.robotstxt.org/.
Just the Beginning
We've been working diligently behind the scenes to bring you some exciting new capabilities in MODX Cloud very soon. We can't wait to share them with you.