Are most third party packages in Python safe to use with Django and Nginx?























I have been developing an app using the Django framework, and recently I have been thinking about its scalability. I am currently developing this app in a civil engineering research lab, and we don't get a ton of traffic on our current server.



I was wondering whether the apps we develop can handle multiple requests at the same time. From my understanding they should, depending on how the server we are using is configured (Nginx in our case, I believe). As I understand it (please correct me if I am wrong), the server has a set of workers (or threads; I am not sure whether those are the same thing) that accept requests, run the code I have written, and generate a response for each request.
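For reference, I believe Nginx itself only proxies requests to a WSGI server such as Gunicorn, and it is that server that owns the pool of workers running my Django code. I am not certain of our exact setup, but I imagine a configuration roughly like this gunicorn.conf.py (the numbers are made up):

    # gunicorn.conf.py -- hypothetical sketch, not our actual config.
    # Nginx proxies each request to Gunicorn; Gunicorn keeps a fixed
    # pool of worker processes (and optionally threads per worker)
    # that import and run the Django application.
    import multiprocessing

    bind = "127.0.0.1:8000"                         # nginx proxy_pass target
    workers = multiprocessing.cpu_count() * 2 + 1   # fixed pool of processes
    worker_class = "gthread"                        # threaded workers
    threads = 2                                     # threads per worker process
    timeout = 30                                    # seconds before a hung request is killed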



My main question is: if I am using third-party packages, how well will they scale if my app starts getting more traffic? For example, if I use pandas/numpy/numba/etc. in my controllers, is that safe when multiple users are sending requests to multiple threads?
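For concreteness, here is roughly the kind of controller I have in mind (a minimal sketch; the CSV path is just a placeholder). Everything stays in local variables, so each request works on its own DataFrame and nothing is shared across threads:

    # views.py -- minimal sketch of the kind of view I have in mind.
    import pandas as pd
    from django.http import JsonResponse

    def summarize(request):
        # Per-request local state only; no module-level mutable objects
        # are shared between concurrent requests.
        df = pd.read_csv("data.csv")          # placeholder input file
        stats = df.describe().to_dict()
        return JsonResponse(stats)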



EDIT: I changed the wording of the question as it was worded a bit poorly initially.










python django nginx thread-safety

asked Nov 10 at 19:27 by Wade, edited Nov 10 at 19:42




















  • In Django, use signals and threads the regular way, along with message queues such as Celery or RQ. Ship that logic out of your views into callbacks and signals, where you can make the processing multithreaded. – dmitryro, Nov 10 at 19:31








  • nginx workers are typically processes, not threads. – Daniel Roseman, Nov 10 at 19:35










  • @DanielRoseman So for every user that hits my app, will a different process be created through Nginx? – Wade, Nov 10 at 19:40






  • No, it uses a fixed pool of workers. – Daniel Roseman, Nov 10 at 19:43








  • Yes, both of those are correct. As dmitryro said though, if you have tasks that take a long time you should probably extract them into asynchronous jobs using something like Celery, so that requests can complete quickly and don't get blocked. – Daniel Roseman, Nov 10 at 19:55
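A minimal sketch of what dmitryro's and Daniel Roseman's suggestion might look like, assuming a Celery app is already configured for the project (the task, view, and file names below are hypothetical):

    # tasks.py -- heavy pandas work moved out of the request/response cycle.
    import pandas as pd
    from celery import shared_task

    @shared_task
    def crunch_numbers(csv_path):
        df = pd.read_csv(csv_path)
        return df.describe().to_dict()

    # views.py -- the view only enqueues the job and returns immediately.
    from django.http import JsonResponse
    from myapp.tasks import crunch_numbers    # hypothetical app name

    def start_job(request):
        result = crunch_numbers.delay("data.csv")   # runs in a Celery worker
        return JsonResponse({"task_id": result.id})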
















