Today I set up WordPress with nginx. The problem I can't figure out: some pages are served as expected, while others are served as a download. If I download one of them, the content is the same as index.php in the WordPress root folder. I tried to find differences between the "normal" pages and the defective ones, but couldn't find any.
The funny thing is: if I give a faulty page a different permalink (e.g. ../funktionen -> ../features), it is served correctly. If I switch back to the old permalink, the page is served as a download again.
Here is my nginx conf:
#redirect
# http://www.termin2go.com and
# http://termin2go.com
# to
# https://www.termin2go.com
server {
listen 80;
server_name termin2go.com www.termin2go.com;
rewrite ^(.*) https://www.termin2go.com$1 permanent;
}
# redirect
# https://termin2go.com
# to
# https://www.termin2go.com
server {
listen 443 ssl;
server_name termin2go.com;
rewrite ^(.*) https://www.termin2go.com$1 permanent;
}
server {
listen 443 ssl;
server_name www.termin2go.com;
ssl_certificate /etc/ssl/private/www.termin2go.com.crt;
ssl_certificate_key /etc/ssl/private/www.termin2go.com.key;
access_log /var/log/nginx/termin2go.com.log;
error_log /var/log/nginx/termin2go.com_error.log;
set $root_path '/var/www/wordpress';
root $root_path;
charset utf-8;
client_max_body_size 7M;
# DEFAULT INDEX
index index.php;
# configure prerender for snapshot generation
location / {
try_files $uri $uri/ /index.php?$args @prerender;
}
location ~ /(\.|wp-config\.php|liesmich\.html|readme\.html) {
return 444;
}
# REWRITES
location ~ ^/(\d+)/$ {
return 301 /?p=$1;
}
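The rewrite pattern above is presumably meant to be `^/(\d+)/$` (forum formatting tends to eat backslashes): it captures a purely numeric path segment as a post ID and redirects it to the `?p=` query form. A minimal Python sketch of what that regex matches, assuming the escaped `\d+` version:

```python
import re

# Re-creation of the nginx location regex with the backslash restored:
# match a path that is only digits, e.g. /123/, and capture the ID.
post_id = re.compile(r"^/(\d+)/$")

m = post_id.match("/123/")
print(m.group(1) if m else None)      # → 123 (would redirect to /?p=123)
print(post_id.match("/funktionen/"))  # → None (non-numeric slugs fall through)
```

Without the backslash, `(d+)` only matches literal runs of the letter "d", so the redirect would never fire for numeric permalinks.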
# fix whitespace in url bug
if ($request_uri ~ " ") { return 444; }
# redirect google bot to prerender for ajax content (AngularJS)
location @prerender {
proxy_set_header Host $host;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
#proxy_intercept_errors on;
set $prerender 0;
if ($http_user_agent ~* "googlebot|yahoo|bingbot|baiduspider|yandex|yeti|yodaobot|gigabot|ia_archiver|facebookexternalhit|twitterbot|developers.google.com") {
set $prerender 1;
}
if ($args ~ "_escaped_fragment_|prerender=1") {
set $prerender 1;
}
if ($http_user_agent ~ "Prerender") {
set $prerender 0;
}
if ($prerender = 1) {
rewrite .* /$scheme://$host$request_uri break;
proxy_pass http://127.0.0.1:4000;
}
if ($prerender = 0) {
rewrite .* /index.php break;
}
}
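For clarity, the decision logic of the @prerender block can be sketched in Python; the user-agent list and the flag mirror the three if-blocks in the config above (this is purely an illustration of the control flow, not anything the server runs):

```python
import re

# Bot user agents copied from the config's case-insensitive (~*) match.
BOT_UA = re.compile(
    r"googlebot|yahoo|bingbot|baiduspider|yandex|yeti|yodaobot|gigabot"
    r"|ia_archiver|facebookexternalhit|twitterbot|developers\.google\.com",
    re.IGNORECASE,
)

def should_prerender(user_agent: str, args: str) -> bool:
    """Mirror the three if-blocks: known bots and requests carrying
    _escaped_fragment_ or prerender=1 go to the prerender service,
    except requests made by the prerender service itself (loop guard)."""
    prerender = bool(BOT_UA.search(user_agent))
    if re.search(r"_escaped_fragment_|prerender=1", args):
        prerender = True
    if "Prerender" in user_agent:  # case-sensitive, like the ~ match
        prerender = False
    return prerender
```

So `should_prerender("Googlebot/2.1", "")` is true and the request is proxied to 127.0.0.1:4000, while the prerender service's own requests fall through to /index.php.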
# cache static files one month
location ~* \.(css|cur|js|jpe?g|gif|htc|ico|png|html|xml|otf|ttf|eot|woff|svg)$ {
expires 31d;
add_header Pragma "public";
add_header Cache-Control "public, must-revalidate, proxy-revalidate";
}
# pass the PHP scripts to FastCGI server listening on 127.0.0.1:9000
#
location ~ \.php$ {
try_files $uri =404;
fastcgi_split_path_info ^(.+\.php)(/.+)$;
#NOTE: You should have "cgi.fix_pathinfo = 0;" in php.ini
include fastcgi_params;
fastcgi_index index.php;
fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
#fastcgi_intercept_errors on;
fastcgi_pass unix:/var/run/php5-fpm.sock;
}
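Note that in this location block (and in fastcgi_split_path_info) the dots before "php" look unescaped, most likely because the forum formatting stripped the backslashes. An unescaped dot matches any character, so the pattern is looser than intended. A short Python demonstration of the difference, assuming the config meant `\.php$`:

```python
import re

# An unescaped dot matches ANY character, so ".php$" is looser than
# intended: it also matches paths that merely end in "php".
loose  = re.compile(r".php$")
strict = re.compile(r"\.php$")

print(bool(loose.search("/wp-login_php")))   # → True (accidental match)
print(bool(strict.search("/wp-login_php")))  # → False
print(bool(strict.search("/index.php")))     # → True
```

In this particular config the loose version still catches real .php URIs, so it is not the cause of the download problem, but the escaped form is what the pattern should be.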
# prevent nginx from serving dotfiles (.htaccess, .svn, .git, etc.)
location ~ /\. {
deny all;
access_log off;
log_not_found off;
}
}
In my case the fix was:
This happened to me when our host force-updated the PHP version. We had to replace a number of obsolete PHP functions.