The security check fails, but security files have been generated.
./console core:create-security-files
No errors
We found that the above URLs are accessible via the browser, but they should NOT be. Allowing them to be accessed can pose a potential security risk since the contents can provide information about your server and potentially your users. Please restrict access to them.
We also found that Matomo's config directory is publicly accessible. While attackers can't read the config now, if your webserver stops executing PHP files for some reason, your MySQL credentials and other information will be available to anyone. Please check your webserver config and deny access to this directory.
Fix bug.
None
If I run ./console diagnostics:run I get "Unable to test if mod_pagespeed is enabled: the request to http://unknown/./console?module" which seems to be a known bug since 2017. No idea how to solve this.
@alexhass the mod_pagespeed log can be ignored in this case. You might just want to manually check if mod_pagespeed is enabled or not.
If I understand correctly then some of the URLs are accessible via the browser, but they should NOT be. Which web browser are you using?
See also https://matomo.org/faq/troubleshooting/how-do-i-fix-the-error-private-directories-are-accessible/
I have an Apache2 machine on Debian 9. I used the linked article to fix the issue, but it doesn't seem to work. .htaccess is allowed to override all settings. I guess your permission files are not working well.
Hi @alexhass, we haven't had any problems there otherwise in the past and it seems to work in general. Can you check whether the .htaccess files were actually created? For example, do config/.htaccess, plugins/.htaccess and tmp/.htaccess exist?
mod_auth is not enabled.
Same here. I use Matomo within Docker and the .htaccess files have been created. But the security check still fails.
If I try to access these paths with my browser I get a 403.
Why doesn't the security check recognize the 403 status code?
Never mind.
My problem is that the system check tries https://mydomain.com/config/config.ini.php instead of https://mydomain.com/matomo/config/config.ini.php (Issue #17945).
Thanks for identifying the root cause. My install is also in a subdir. So I guess it is just a false alarm for now.
Thanks for confirming @alexhass
There must be something wrong with the method that builds the URL: as far as I can see it's actually supposed to include the path, but that doesn't seem to be the case in https://github.com/matomo-org/matomo/blob/4.6.0-b3/core/SettingsPiwik.php#L193
I guess once https://github.com/matomo-org/matomo/issues/18306 is fixed, this issue might be fixed too. Or the other way around.
I unsuccessfully tried to recreate this issue using a fresh install of Matomo 4.5.0 in a subdirectory with Apache 2.4 on Debian 10. The config directory .htaccess file is read and any browser attempt to access /matomo/config/config.ini.php is blocked with a 403 error. The system check calls the getCurrentUrlWithoutFilename() method and it returns the URL with the correct subdirectory each time, so the publicly accessible directory check passes successfully.
I did notice that if PHP-FPM is used then the .htaccess file will be ignored as .php files will be passed to the proxy and not handled by Apache. This did result in the publicly accessible directory system check failing.
@bx80 could you maybe check the code to see if you can find any reason why this wouldn't work? Or ask the two reporters some questions about their setup that might help explain why this isn't working?
@alexhass @54mu3l are you running the system check through the console or the UI?
Is there anything special about your setup? Maybe there's a proxy involved or something?
I'm running the system check through the UI.
Here is some more information about my setup:
I'm running everything in Docker.
For Matomo I'm using the official Matomo image.
A separate nginx image works as a proxy:
location /matomo/ {
proxy_set_header Host $host;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Host $host;
proxy_set_header X-Forwarded-Uri /matomo/;
proxy_pass http://172.28.1.10/;
proxy_read_timeout 90;
}
config.ini.php
[General]
force_ssl = 1
assume_secure_protocol = 1
proxy_client_headers[] = "HTTP_X_FORWARDED_FOR"
proxy_host_headers[] = "HTTP_X_FORWARDED_HOST"
salt = "xxxxxxx"
trusted_hosts[] = "myhost.com"
I hope this helps.
Thanks @54mu3l, that is very helpful. I didn't configure a reverse proxy when trying to recreate this issue, so it could be that a combination of installing in a subdirectory and using a reverse proxy causes the problem. I'll take a look at issue #17945 first as it might offer insight into this issue.
@alexhass Thanks for the extra information, it sounds like the issue you are experiencing could be caused by PHP-FPM bypassing the .htaccess rules that are meant to prevent browser access to the /config directory. I use PHP-FPM too and have the same failed check.
When Apache proxies PHP files to PHP-FPM the .htaccess rules are ignored, so the expected 403 access denied response is not returned; instead PHP-FPM executes the config.ini.php file, which returns a blank output. The system 'required private directories' check expects to receive the 403 access denied response, and when it doesn't the check fails.
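To illustrate, here is a rough sketch (not Matomo's actual code; the function name, URL and curl usage are just placeholders) of what the private directories check effectively boils down to:
<?php
// Sketch only: request a file that should be blocked and treat anything
// other than an error status as "publicly accessible".
function isPathProtected(string $baseUrl, string $path): bool
{
    $ch = curl_init($baseUrl . $path);
    curl_setopt_array($ch, [
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_FOLLOWLOCATION => true,
    ]);
    curl_exec($ch);
    $status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);

    // With plain Apache the .htaccess rules answer 403 here. With php-fpm the
    // request is proxied instead, config.ini.php is executed and comes back as
    // 200 with an empty body, so the directory is reported as publicly accessible.
    return $status >= 400;
}

var_dump(isPathProtected('https://mydomain.com/matomo/', 'config/config.ini.php'));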
One workaround is to add an additional proxy pass rule to your Apache config to prevent access to the config directory. This can be done by adding ProxyPass /config ! just above the ProxyPassMatch line, e.g.:
<IfModule mod_proxy_fcgi.c>
ProxyPass /config !
ProxyPassMatch "^/(.*\.php(/.*)?)$" "unix:/run/php/php7.4-fpm.sock|fcgi://localhost/var/www/matomo"
</IfModule>
It would be better if we could handle this without the need to manually add extra rules to the web server config.
@tsteur Do you think it is worth enhancing the 'required private directories' system check to detect if PHP-FPM is being used and then either...
a) allow the check to pass as long as a blank output is being returned (is this safe? The config.ini.php file is set to exit before any output if executed; see the snippet below)
or
b) show some instructions on how to setup a ProxyPass rule / FAQ link?
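For reference on (a): if I remember correctly, the guard at the top of config/config.ini.php looks roughly like the first line below (the second line is just a placeholder for the rest of the file), which is why executing it through php-fpm only produces a blank page instead of exposing credentials:
; <?php exit; ?> DO NOT REMOVE THIS LINE
; ... [database] credentials and other settings follow ...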
We would need to adjust https://matomo.org/faq/troubleshooting/how-do-i-fix-the-error-private-directories-are-accessible/ and mention php-fpm.
I suppose we could try to detect php-fpm (if it's easy to do, but note that people also execute the diagnostic check on the command line, so this might not really be possible). Not sure how easy it is to detect this. I think it might be good enough to mention in the above FAQ how to adjust php-fpm.
@Findus23 any thoughts?
BTW, have you so far been able to reproduce the error where the request is sent to the wrong URL, mentioned in https://github.com/matomo-org/matomo/issues/18132#issuecomment-966908570 ?
I was mostly keen on fixing this issue so we are requesting the right URL.
Honestly I'm confused… my Drupal installs have not shown any issues and I think the Apache rules all work and protect all files properly. I need to check that to compare with Matomo.
@alexhass Sorry for the confusion, I'm sure your Apache rules are fine. In this case Matomo is expecting one directory to be protected in a particular way, php-fpm changes this and causes the Matomo system check to fail.
You can check if the config directory is protected by trying to load yourdomain.com/config/config.ini.php in the browser. If you get a 403 access denied message then the config directory is properly protected and the system check should pass.
If you get a blank page then it is likely that the access rules are not being applied because of php-fpm and the system check will fail, however no information should be exposed due to a secondary check in the config file itself. You could either ignore the system check or try adding the ProxyPass rule to protect the config directory in the expected way.
I hope that's some help.
@tsteur It's fairly easy to detect if php-fpm is running for the current script with something like echo (php_sapi_name() === 'fpm-fcgi' ? 'running fpm' : 'not running fpm'); but there is no easy way to detect if php-fpm is running on the web server when the system check is run from the console. Maybe we could just detect php-fpm if not running from the command line and then show a link to some more information? I'm guessing most people use the UI to run the check?
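For what it's worth, here's a minimal sketch of that detection (same idea as the one-liner above, just with the console case split out; PHP_SAPI is equivalent to php_sapi_name()):
<?php
// Only meaningful when the diagnostic runs through the web server; a console
// run always reports the 'cli' SAPI regardless of how the web server executes PHP.
if (PHP_SAPI === 'cli') {
    echo "Run from the console - cannot tell how the web server executes PHP.\n";
} elseif (PHP_SAPI === 'fpm-fcgi') {
    echo "Running under php-fpm - .htaccess rules will not apply to .php requests.\n";
} else {
    echo 'Running under SAPI: ' . PHP_SAPI . "\n";
}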
There seem to be two separate issues here: php-fpm bypassing .htaccess rules, causing the private directories check to fail (this issue), and the wrong URL being requested with reverse proxies (addressed in #17945 - I managed to recreate that issue and have a suggested fix detailed there).
I'm not familiar at all with how Apache uses php-fpm, but I always assumed that it would work the same way as a regular nginx with php-fpm setup, where all static files are served by the webserver and only requests to .php files (and in the case of Matomo-nginx only a subset of those) are forwarded to php-fpm. So I always assumed that php-fpm had no influence on .htaccess files.
But if this is the case, then we could detect php-fpm and fix a part of #13589 at the same time.
@Findus23 That's correct, Apache only serves the static files and typically the proxy rules will send anything *.php to php-fpm. It's only when serving static files that Apache will check the .htaccess rules; for anything passed to a proxy there is no check done by either Apache or php-fpm. So while php-fpm doesn't influence the .htaccess files themselves, it does mean that they will be ignored when requesting .php files. (e.g. requesting a non-php file from the config directory will return a 403, but a php file will be executed)
The linked Matomo-nginx config with rules to exclude proxying php files from certain sensitive locations is the nginx equivalent of the Apache ProxyPass rule I mentioned above. Seems like it would be good to have both scenarios detected and provide information on recommended server configurations.
fyi: I just updated my system to 4.7.1 and I still get the same error!
The system check page still shows the following URLs (without the subdirectory):
https://mydomain.com/config/config.ini.php
https://mydomain.com/tmp/cache/tracker/matomocache_general.php
https://mydomain.com/tmp/
https://mydomain.com/tmp/empty
https://mydomain.com/lang/en.json
Instead I would expect something like this:
https://mydomain.com/matomo/config/config.ini.php
https://mydomain.com/matomo/tmp/cache/tracker/matomocache_general.php
https://mydomain.com/matomo/tmp/
https://mydomain.com/matomo/tmp/empty
https://mydomain.com/matomo/lang/en.json
Same here with PHP-FPM.
Aside from this, Debian 9 has no ProxyPassMatch rule set. The config only looks like this:
# Redirect to local php-fpm if mod_php is not available
<IfModule !mod_php7.c>
<IfModule proxy_fcgi_module>
# Enable http authorization headers
<IfModule setenvif_module>
SetEnvIfNoCase ^Authorization$ "(.+)" HTTP_AUTHORIZATION=$1
</IfModule>
<FilesMatch ".+\.ph(ar|p|tml)$">
SetHandler "proxy:unix:/run/php/php7.3-fpm.sock|fcgi://localhost"
</FilesMatch>
# The default configuration works for most of the installation, however it could
# be improved in various ways. One simple improvement is to not pass files that
# doesn't exist to the handler as shown below, for more configuration examples
# see https://wiki.apache.org/httpd/PHP-FPM
# <FilesMatch ".+\.ph(ar|p|tml)$">
# <If "-f %{REQUEST_FILENAME}">
# SetHandler "proxy:unix:/run/php/php7.3-fpm.sock|fcgi://localhost"
# </If>
# </FilesMatch>
<FilesMatch ".+\.phps$">
# Deny access to raw php sources by default
# To re-enable it's recommended to enable access to the files
# only in specific virtual host or directory
Require all denied
</FilesMatch>
# Deny access to files without filename (e.g. '.php')
<FilesMatch "^\.ph(ar|p|ps|tml)$">
Require all denied
</FilesMatch>
</IfModule>
</IfModule>
Hi @alexhass,
If your Apache config is using FilesMatch / SetHandler instead of ProxyPass then you could use a FilesMatch rule to deny access to the private directories with something like:
<FilesMatch "^/(config|tmp|core|lang)$">
Require all denied
</FilesMatch>
Hi @54mu3l,
Could you try adding proxy_uri_header = 1 to the [General] section of your config/config.ini.php file and see if that fixes the system check file paths?
Ok it works now!
I just needed to add proxy_uri_header = 1 to the [General] section of the config/config.ini.php file.
Thanks everyone!
Hi @alexhass, If your Apache config is using FilesMatch / SetHandler instead of ProxyPass then you could use a FilesMatch rule to deny access to the private directories with something like:
<FilesMatch "^/(config|tmp|core|lang)$">
Require all denied
</FilesMatch>
That is too general. That could affect other applications on the server, too.
Yes, that's just a generic example. If there are other applications being hosted under the same configuration then the rule would need to be adjusted to take that into account. :slightly_smiling_face:
Don't use FilesMatch for URLs! Try this:
<LocationMatch "^/+(config|tmp|core|lang)(/.*)?$">
Require all denied
</LocationMatch>