How to plot a graph from multiple independent variables and one dependent variable in Python [multiple linear regression]












I am new to machine learning and I am trying to work out how to remove independent variables in multiple linear regression.

Steps I have gone through: 1) read the dataset, 2) separate it into X and y, 3) encode the categorical data (the dataset contains columns such as prof rank and profession), 4) remove a dummy variable, 5) run OLS regression.

I started with 7 independent variables; after OLS I have 6, having removed the one whose p-value was greater than the 0.05 significance level.

Can you suggest the steps to plot the graph after removing all unnecessary independent variables, as shown in the attached image? How can I get down to just ONE independent variable from all these variables?

Also, how do I check for multicollinearity in Python? What is VIF, and how do I use it to detect multicollinearity?

Thanks in advance, and sorry for any grammar mistakes.



(attached image: OLS Regression Results Summary)
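For reference, the elimination loop described in the steps above can be sketched as follows. This is a generic illustration with made-up data and column names (`x1`, `x2`, `x3`), not the asker's actual dataset, and it computes the OLS p-values directly with NumPy/SciPy rather than reading them off a statsmodels summary:

```python
import numpy as np
from scipy import stats

def ols_pvalues(X, y):
    """OLS fit with an intercept; returns coefficients and two-sided p-values."""
    n = X.shape[0]
    Xd = np.column_stack([np.ones(n), X])          # prepend intercept column
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    resid = y - Xd @ beta
    dof = n - Xd.shape[1]
    sigma2 = resid @ resid / dof                   # residual variance estimate
    se = np.sqrt(np.diag(sigma2 * np.linalg.inv(Xd.T @ Xd)))
    tvals = beta / se
    return beta, 2 * stats.t.sf(np.abs(tvals), dof)

def backward_eliminate(X, names, y, alpha=0.05):
    """Repeatedly drop the predictor with the highest p-value above alpha."""
    X, names = X.copy(), list(names)
    while True:
        _, pvals = ols_pvalues(X, y)
        worst = int(np.argmax(pvals[1:]))          # skip the intercept's p-value
        if pvals[1:][worst] <= alpha or X.shape[1] == 1:
            return X, names
        X = np.delete(X, worst, axis=1)
        del names[worst]

# Toy data: y depends on x1 and x2, while x3 is pure noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = 3 * X[:, 0] - 2 * X[:, 1] + rng.normal(scale=0.5, size=200)
X_sel, kept = backward_eliminate(X, ["x1", "x2", "x3"], y)
print(kept)
```

The genuinely predictive columns survive the loop; the noise column is usually the one eliminated.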










  • do you understand p value?

    – Dejan Marić
    Nov 13 '18 at 13:36











  • Welcome to SO; please see why an image of your code is not helpful

    – desertnaut
    Nov 13 '18 at 13:40











  • @DejanMarić, yes, I guess it's the predictor's p-value that needs to be below the significance level of 0.05; from this we can determine whether the null hypothesis is true or false.

    – ojas mehta
    Nov 13 '18 at 14:22
















python machine-learning plot linear-regression

edited Nov 13 '18 at 14:30
asked Nov 13 '18 at 13:24
ojas mehta (61)
1 Answer
It's rather difficult to visualise a multidimensional linear relationship directly. This post shares some common ways to do it.
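One simple option (an illustration on synthetic data, not the only approach): since you cannot draw a 7-dimensional hyperplane, plot the model's predicted values against the actual values of the dependent variable. A good fit hugs the 45° line:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # non-interactive backend; drop this line to display the plot
import matplotlib.pyplot as plt

# Stand-in data: replace X and y with your own predictors and target.
rng = np.random.default_rng(1)
X = rng.normal(size=(150, 3))
y = 2 * X[:, 0] + X[:, 1] - X[:, 2] + rng.normal(scale=0.4, size=150)

# Fit OLS via least squares and compute fitted values.
Xd = np.column_stack([np.ones(len(X)), X])
beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
y_hat = Xd @ beta

fig, ax = plt.subplots()
ax.scatter(y, y_hat, s=12)
lims = [min(y.min(), y_hat.min()), max(y.max(), y_hat.max())]
ax.plot(lims, lims, color="red", lw=1)             # 45-degree reference line
ax.set_xlabel("actual y")
ax.set_ylabel("predicted y")
fig.savefig("fit.png")
```

This collapses any number of predictors into a single two-dimensional picture of how well the regression fits.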



Multicollinearity is a big problem for regression: it makes the estimated beta coefficients unstable and hard to interpret. VIF (variance inflation factor) is one of the tools used to detect it; generally, the closer a VIF is to 1, the better.
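As a concrete illustration: the VIF of predictor j is 1 / (1 − R²ⱼ), where R²ⱼ comes from regressing column j on all the other predictors. statsmodels ships this as `statsmodels.stats.outliers_influence.variance_inflation_factor`; the plain-NumPy version below is just to show the formula:

```python
import numpy as np

def vif(X):
    """VIF per column: 1 / (1 - R^2) from regressing that column on the rest."""
    X = np.asarray(X, dtype=float)
    out = []
    for j in range(X.shape[1]):
        target = X[:, j]
        others = np.delete(X, j, axis=1)
        others = np.column_stack([np.ones(len(X)), others])  # intercept
        beta, *_ = np.linalg.lstsq(others, target, rcond=None)
        resid = target - others @ beta
        r2 = 1 - (resid @ resid) / ((target - target.mean()) ** 2).sum()
        out.append(1.0 / (1.0 - r2))
    return np.array(out)

rng = np.random.default_rng(2)
a = rng.normal(size=500)
b = rng.normal(size=500)
c = a + 0.05 * rng.normal(size=500)   # c is nearly a copy of a -> high VIF
X = np.column_stack([a, b, c])
print(vif(X))  # columns a and c get large VIFs; b stays near 1
```

A common rule of thumb is to investigate any predictor whose VIF exceeds 5 to 10.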



If you have multicollinearity, you might want to proceed with one of the following options:




  • Throwing out correlated variables

  • Transforming correlated variables: use Principal Component Analysis to uncover the latent data structure, or use Partial Least Squares
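A minimal sketch of the PCA option, using a plain SVD on synthetic data (`sklearn.decomposition.PCA` does the same with more conveniences): project the correlated predictors onto a few orthogonal components and regress on those instead.

```python
import numpy as np

def pca_transform(X, n_components):
    """Center X and project it onto its top principal components via SVD."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T               # component scores

rng = np.random.default_rng(3)
a = rng.normal(size=300)
X = np.column_stack([a, a + 0.1 * rng.normal(size=300), rng.normal(size=300)])
Z = pca_transform(X, 2)                           # 3 correlated columns -> 2 components

# The component scores are mutually orthogonal, so the multicollinearity is gone.
print(np.corrcoef(Z, rowvar=False).round(3))
```

The trade-off is interpretability: the components are linear mixtures of the original variables rather than the variables themselves.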






        answered Nov 13 '18 at 13:41









Ethan Nguyen
