Is it possible to make a website always online?
Well, I want to make my website always online.
One method: is it possible to cache a whole website using nginx proxy_pass for a long time? And how?
What I actually mean is: if the backend is down or not working, visitors can still be served my website from the cache.
Another method: a crawler?
Or snapshot technology, like a search engine's cached pages?
nginx caching proxypass
asked Nov 15 '18 at 3:18
Peter Lee
1 Answer
It is not actually proxy_pass that caches websites: by itself it only forwards requests to another endpoint (unless the application on the other end keeps its own cache of the site). Caching in nginx is handled by the separate proxy_cache directives, as shown below. A crawler, for the record, usually refers to a search-engine bot that follows links through a website so the search engine can index its contents; it will not keep your site online.
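If you do want nginx itself to serve a cached copy while the backend is down, here is a minimal sketch. The backend address 127.0.0.1:8080, the cache path /var/cache/nginx, and the zone name site_cache are placeholders for illustration; adjust them to your setup.
proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=site_cache:10m
                 max_size=1g inactive=24h use_temp_path=off;
server {
    listen 80;
    location / {
        proxy_pass http://127.0.0.1:8080;   # placeholder backend address
        proxy_cache site_cache;
        proxy_cache_valid 200 10m;          # keep good responses for 10 minutes
        # fall back to the (possibly stale) cached copy when the backend fails:
        proxy_cache_use_stale error timeout http_500 http_502 http_503 http_504;
        add_header X-Cache-Status $upstream_cache_status;  # shows HIT/MISS/STALE for debugging
    }
}
With proxy_cache_use_stale set, nginx answers from the last cached copy instead of returning the upstream error, which is roughly the "always online" behaviour you describe.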
"Snapshots tech" works and are usually created by CDNs such as Cloudflare/Akamai, and is probably what you are looking for. CDNs are also used for many other things but I assume you are most interested in being able to show a copy of your website if it occasionally goes offline.
There is also the option of setting browser-caching headers in NGINX, which instruct the user's browser to show a cached copy of your website and not refresh it until the cache expires. The downsides are that users will not see the live copy of your site even while it is online, and that a user must already have visited the page within the cache window for it to be cached at all.
An example:
location ~* \.(?:js|css|html)$ {
    expires 1d;                          # browsers cache matching files for one day
    add_header Pragma public;            # legacy HTTP/1.0 caching hint
    add_header Cache-Control "public";   # allow shared caches to store the response
}
I only cache for one day, so if there is any update to my website, it will show up the next day.
– Peter Lee
Nov 16 '18 at 2:48
Yes, that works, but users must also have visited that webpage within the last day for it to be in their cache.
– Orphamiel
Nov 16 '18 at 4:16
Yes, that's no problem.
– Peter Lee
Nov 16 '18 at 7:52
proxy_cache_key $host$uri$is_args$args;   # cache key: host + URI + query string
proxy_cache_min_uses 1;                   # cache after the first request
proxy_cache_valid 200 720m;               # keep successful responses for 12 hours
proxy_cache_valid 500 502 503 1m;         # keep error responses for only a minute
proxy_cache_valid 302 5m;
proxy_cache_valid any 5m;
proxy_hide_header X-Powered-By;
proxy_hide_header Vary;
proxy_hide_header Transfer-Encoding;
– Peter Lee
Nov 16 '18 at 8:25
edited Nov 16 '18 at 4:15
answered Nov 15 '18 at 16:09
Orphamiel