Undefined symbols for architecture x86_64: “_fcloseall”











I am compiling hadoop-yarn-nodemanager.



Compiling environment: macOS 10.14, Java 1.7.0_80, CMake 3.13.0-rc3 with clang-1000.10.44.4, Maven 3.6.0, protobuf 2.5.0.



I'm trying to install Hadoop 2.2.0 on macOS, but as its documentation says,




The native hadoop library is supported on *nix platforms only. The library does not work with Cygwin or the Mac OS X platform.




So I have to re-compile Hadoop's source code. In the downloaded hadoop-2.2.0-src folder I run mvn package -Pdist,native -DskipTests -Dtar, and after a few minutes of compiling the new native library should appear at hadoop-2.2.0-src/hadoop-dist/target/hadoop-2.2.0/lib/native. However, I keep getting error messages. Some I have already fixed by modifying the source code, but now I'm stuck compiling hadoop-yarn-server-nodemanager.



Current compiling process



Here's the error message:



 [exec] [ 57%] Linking C executable target/usr/local/bin/test-container-executor
[exec] /Applications/CMake.app/Contents/bin/cmake -E cmake_link_script CMakeFiles/test-container-executor.dir/link.txt --verbose=1
[exec] /Library/Developer/CommandLineTools/usr/bin/cc -g -Wall -O2 -D_GNU_SOURCE -D_REENTRANT -isysroot /Library/Developer/CommandLineTools/SDKs/MacOSX10.14.sdk -Wl,-search_paths_first -Wl,-headerpad_max_install_names CMakeFiles/test-container-executor.dir/main/native/container-executor/test/test-container-executor.c.o -o target/usr/local/bin/test-container-executor libcontainer.a
[exec] Undefined symbols for architecture x86_64:
[exec] "_fcloseall", referenced from:
[exec] _launch_container_as_user in libcontainer.a(container-executor.c.o)
[exec] ld: symbol(s) not found for architecture x86_64
[exec] clang: error: linker command failed with exit code 1 (use -v to see invocation)
[exec] make[2]: *** [target/usr/local/bin/test-container-executor] Error 1
[exec] make[1]: *** [CMakeFiles/test-container-executor.dir/all] Error 2
[exec] make: *** [all] Error 2


I've tried switching CMake's compiler from clang to gcc, but it was of no use.
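(For reference, switching CMake's compiler is normally done by exporting CC=gcc and CXX=g++ before the first configure, or by passing -DCMAKE_C_COMPILER=gcc on the cmake command line; how those settings reach CMake through Hadoop's Maven build may differ, and on macOS "gcc" is often just an alias for clang anyway.)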



Relevant to the error message, I found the following code.



In hadoop-2.2.0-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/CMakeLists.txt:



add_executable(test-container-executor
    main/native/container-executor/test/test-container-executor.c
)
target_link_libraries(test-container-executor
    container
)
output_directory(test-container-executor target/usr/local/bin)


In hadoop-2.2.0-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/native/container-executor/impl/container-executor.c:



int launch_container_as_user(const char *user, const char *app_id, 
const char *container_id, const char *work_dir,
const char *script_name, const char *cred_file,
const char* pid_file, char* const* local_dirs,
char* const* log_dirs, const char *resources_key,
char* const* resources_values) {...}


In hadoop-2.2.0-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/target/native/CMakeFiles/test-container-executor.dir/link.txt:



/Library/Developer/CommandLineTools/usr/bin/cc  -g -Wall -O2 -D_GNU_SOURCE -D_REENTRANT -isysroot /Library/Developer/CommandLineTools/SDKs/MacOSX10.14.sdk -Wl,-search_paths_first -Wl,-headerpad_max_install_names  CMakeFiles/test-container-executor.dir/main/native/container-executor/test/test-container-executor.c.o  -o target/usr/local/bin/test-container-executor libcontainer.a 


As for the archive hadoop-2.2.0-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/target/native/libcontainer.a, I found container-executor.c.o after extracting it, but failed to open it because of an encoding problem (it is a binary object file).
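(For reference, the symbols inside the archive can be listed with nm libcontainer.a, for example piping the output through grep fcloseall, instead of opening the object file in a text editor.)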



Furthermore, an error was raised earlier when compiling this project:



 [exec] /Users/markdana/Downloads/hadoop-2.2.0-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/native/container-executor/impl/container-executor.c:1252:48: error: too many arguments to function call, expected 4, have 5
[exec] if (mount("none", mount_path, "cgroup", 0, controller) == 0) {
[exec] ~~~~~ ^~~~~~~~~~
[exec] /Library/Developer/CommandLineTools/SDKs/MacOSX10.14.sdk/usr/include/sys/mount.h:399:1: note: 'mount' declared here
[exec] int mount(const char *, const char *, int, void *);


To fix it I temporarily modified the declaration of mount() in mount.h to:



int mount(const char *, const char *, const char *,int, const char *);


I know it's a crude hack, but at least it works. Then I ran into the new problem shown above. I'm wondering whether the two issues are related, or whether this is a bug in how the library is linked.
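A less invasive alternative than editing the SDK header would be to guard the Linux-only call itself. This is only an untested sketch; the helper name mount_cgroup_controller is made up here and is not part of Hadoop's code:

/* Untested sketch: wrap the Linux-only cgroup mount so the macOS SDK's
 * 4-argument mount() declaration never has to be patched. */
#include <stdio.h>

#ifdef __linux__
#include <sys/mount.h>
#endif

static int mount_cgroup_controller(const char *mount_path,
                                   const char *controller) {
#ifdef __linux__
    /* Linux mount(): (source, target, fstype, flags, data). */
    return mount("none", mount_path, "cgroup", 0, controller);
#else
    /* macOS has no cgroups; report the limitation instead of calling mount(). */
    fprintf(stderr, "cgroup mount of %s (%s) skipped: unsupported on this OS\n",
            mount_path, controller);
    return -1;
#endif
}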



I've been debugging for a whole day and feel lost about what to try next. I would appreciate it if you could point out the key issue, or share any similar experience handling CMake linking problems.










Tags: java, xcode, macos, hadoop, cmake






asked Nov 11 at 12:27 by Dana Mark, edited Nov 12 at 7:52 by Tsyvarev
          1 Answer
Accepted answer:










It seems that the function fcloseall doesn't exist on OS X. From Porting UNIX/Linux Applications to OS X:




          fcloseall



          This function is an extension to fclose. Although OS X supports fclose, fcloseall is not supported. You can use fclose to implement fcloseall by storing the file pointers in an array and iterating through the array.




You need to redesign the application so that every file which would have been closed by fcloseall is stored somewhere. After that, you can call a plain fclose on each such file, as noted in the citation.
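For illustration, a minimal sketch of that approach could look like the following; the names track_fopen and close_all_tracked are invented here and are not part of Hadoop's sources:

/* Hypothetical sketch of the workaround described above: keep every
 * FILE* in an array so they can all be closed without fcloseall(). */
#include <stdio.h>

#define MAX_TRACKED_FILES 64

static FILE *tracked_files[MAX_TRACKED_FILES];
static int tracked_count = 0;

/* fopen() wrapper that remembers the stream it returns. */
static FILE *track_fopen(const char *path, const char *mode) {
    FILE *fp = fopen(path, mode);
    if (fp != NULL && tracked_count < MAX_TRACKED_FILES) {
        tracked_files[tracked_count++] = fp;
    }
    return fp;
}

/* Portable stand-in for fcloseall(): close everything we opened. */
static void close_all_tracked(void) {
    for (int i = 0; i < tracked_count; i++) {
        fclose(tracked_files[i]);
    }
    tracked_count = 0;
}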






answered Nov 11 at 13:50 by Tsyvarev
• Thanks very much, I've successfully built it. In container-executor.c I searched for the open and close calls (Linux C functions) and checked whether they matched. Since no fopen appears before fcloseall(), only fclose(stdin), fclose(stdout), and fclose(stderr), I simply deleted fcloseall(), and it works! Thanks again!
  – Dana Mark, Nov 12 at 2:16
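In other words, the change described in the comment amounts to something like this sketch; the helper name close_standard_streams is hypothetical, and the real edit is made inline in launch_container_as_user:

/* Sketch of the fix from the comment above: drop the non-portable
 * fcloseall() and close only the standard streams, which are the only
 * streams open at that point in launch_container_as_user(). */
#include <stdio.h>

static void close_standard_streams(void) {
    fclose(stdin);
    fclose(stdout);
    fclose(stderr);
    /* fcloseall();   removed: not available on macOS */
}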










