glMapBuffer returns NULL on Mac, works on Windows just fine
I'm having a weird issue trying to move my working game code from Windows to Mac. I'm using SDL2 and OpenGL.
The game is a simple 2D game, basically only doing 2D quad/sprite rendering. The rendering architecture is simple. I'm using a single static Element Array Buffer that is prefilled at startup (6 indices, 4 vertices per quad), and every frame I push a new VBO with the sprite data (4 vertices per sprite, each containing xy position, color to modulate, and uv texture coordinates).
I'm doing this by calling glMapBuffer(GL_ARRAY_BUFFER, GL_WRITE_ONLY) at the start of the frame, pushing the sprite data onto the returned pointer, and then calling glUnmapBuffer(GL_ARRAY_BUFFER) at the end, before SDL_GL_SwapWindow.
Everything seems to work just fine on Windows; however, when I tried compiling and running on a Mac, it compiled fine, but whenever I call glMapBuffer it returns NULL.
I tried looking for GL errors, but no luck: glGetError() returns 0.
It's possible that I have an issue somewhere and am actually doing something wrong on Windows too (the code is literally the same). It could be that the driver on Windows is just more lenient and lets the error slide, while on my Mac it can't.
I'm stumped and don't know where to go from here. I littered my code with glGetError() calls but could not find a non-zero return anywhere.
Here is the setup code if it helps:
glGenVertexArrays(1, &overlay_vao);
glBindVertexArray(overlay_vao);
glGenBuffers(1, &overlay_vbo);
glBindBuffer(GL_ARRAY_BUFFER, overlay_vbo);
glBufferData(GL_ARRAY_BUFFER, MAX_OVERLAY_ELEMENTS * 4 * 8 * 4, 0, GL_STREAM_DRAW);
glVertexAttribPointer(0, 2, GL_FLOAT, GL_FALSE, 8 * 4, 0);
glVertexAttribPointer(1, 4, GL_FLOAT, GL_FALSE, 8 * 4, (GLvoid*)(2 * 4));
glVertexAttribPointer(2, 2, GL_FLOAT, GL_FALSE, 8 * 4, (GLvoid*)(6 * 4));
glEnableVertexAttribArray(0);
glEnableVertexAttribArray(1);
glEnableVertexAttribArray(2);
GLuint overlay_element_buffer;
glGenBuffers(1, &overlay_element_buffer);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, overlay_element_buffer);
Uint8 *indicesBytes = AcquireTempMemory();
Uint16 *indices = (Uint16*)indicesBytes;
for (int i = 0; i < MAX_OVERLAY_ELEMENTS; i++)
{
indices[6 * i + 0] = 4 * i + 0;
indices[6 * i + 1] = 4 * i + 1;
indices[6 * i + 2] = 4 * i + 2;
indices[6 * i + 3] = 4 * i + 0;
indices[6 * i + 4] = 4 * i + 2;
indices[6 * i + 5] = 4 * i + 3;
}
glBufferData(GL_ELEMENT_ARRAY_BUFFER, MAX_OVERLAY_ELEMENTS * 6 * 2, indicesBytes, GL_STATIC_DRAW);
AcquireTempMemory is basically not much more than a malloc; I validated that it allocates fine and that the array is filled as expected (on both platforms).
On start of every frame, I bind the VAO and the shader program (even though there is only one of each anyway), set some uniforms, and map the buffer:
glEnable(GL_BLEND);
glUseProgram(renderState.overlayShaderProgram);
glBindVertexArray(renderState.overlayVao);
glUniform1f(renderState.xMultUniformLocation, 1.0f / renderState.aspectRatio);
glUniform1i(renderState.textureUniformLocation, 0);
renderState.overlayRects = 0;
renderState.overlayBuffer = glMapBuffer(GL_ARRAY_BUFFER, GL_WRITE_ONLY);
Then once all sprites have been pushed:
glUnmapBuffer(GL_ARRAY_BUFFER);
glDrawElements(GL_TRIANGLES, renderState.overlayRects * 6, GL_UNSIGNED_SHORT, (const GLvoid*)0);
(After that there is a call to SDL_GL_SwapWindow.)
I'm not sure if it's relevant, but I'm getting the GL functions from SDL_GL_GetProcAddress like this:
glMapBuffer = (glMapBuffer_TYPE)SDL_GL_GetProcAddress("glMapBuffer");
glUnmapBuffer = (glUnmapBuffer_TYPE)SDL_GL_GetProcAddress("glUnmapBuffer");
I'm really stuck... has anyone ever seen something like this, or can point me to something I'm doing wrong?
opengl sdl sdl-2
First of all: which OpenGL version are you using? OSX has a couple of restrictions there. Second: this code is still far from being an MCVE, since it is totally unclear what your GL state is at the time of the glMapBuffer call. – derhass, Nov 11 at 12:25
I'm using OpenGL 3.3. I know it's far from complete; there is a lot more, but I did not want to clutter the question with every single piece of GL-related code I have. – Bob, Nov 11 at 12:37
3.3 core profile? Did you actually print the GL_VERSION string on the Mac? – derhass, Nov 11 at 12:59
I guess I have a much deeper issue here that I'm missing: glGetString(GL_VERSION) returns NULL as well. Something must have gone really wrong when creating the GL context. OK, at least I have somewhat of a lead now... thanks. – Bob, Nov 11 at 16:00
Well, what context version do you request? – derhass, Nov 11 at 16:19
asked Nov 11 at 11:26 – Bob
1 Answer
If you are using OpenGL 3.3, your OSX version provides the GL function pointers on its own; there is no need to retrieve them with any GetProcAddress call.
Even worse, you are using the same names as the GL API functions: glMapBuffer instead of something like myglMapBuffer. Apple advises against this:
From the Apple documentation:
Note that each function pointer uses the prefix pf to distinguish it
from the function it points to. Although using this prefix is not a
requirement, it's best to avoid using the exact function names.
To use the same code on Windows, Linux, and OSX, I recommend wrapping the non-OSX code in #ifndef like this:
#ifndef __APPLE__
glMapBuffer = (glMapBuffer_TYPE)SDL_GL_GetProcAddress("glMapBuffer");
glUnmapBuffer = (glUnmapBuffer_TYPE)SDL_GL_GetProcAddress("glUnmapBuffer");
//etc
#endif
Also, you may need this Apple-specific include guard (if SDL doesn't take care of it):
#ifdef __APPLE__
#define GL_DO_NOT_WARN_IF_MULTI_GL_VERSION_HEADERS_INCLUDED
#include <OpenGL/gl3.h>
#endif
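Separately, the comments note that glGetString(GL_VERSION) returns NULL, which suggests the context itself was never created as 3.3. Below is a minimal sketch (window title, size, and error handling are placeholders) of explicitly requesting a 3.3 core profile with SDL2 before creating the window; macOS only hands out GL contexts newer than 2.1 when a core profile is explicitly requested:

```c
#include <SDL.h>
#include <SDL_opengl.h>
#include <stdio.h>

int main(void) {
    if (SDL_Init(SDL_INIT_VIDEO) != 0) {
        fprintf(stderr, "SDL_Init failed: %s\n", SDL_GetError());
        return 1;
    }
    /* These attributes must be set BEFORE the window is created. */
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 3);
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 3);
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_PROFILE_MASK,
                        SDL_GL_CONTEXT_PROFILE_CORE);

    SDL_Window *win = SDL_CreateWindow("game",
        SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
        800, 600, SDL_WINDOW_OPENGL);
    SDL_GLContext ctx = win ? SDL_GL_CreateContext(win) : NULL;
    if (!ctx) {
        fprintf(stderr, "GL context creation failed: %s\n", SDL_GetError());
        return 1;
    }
    /* If this prints (null), the context was never created correctly. */
    printf("GL_VERSION: %s\n", (const char *)glGetString(GL_VERSION));
    return 0;
}
```

Printing GL_VERSION right after context creation, as above, confirms which version and profile the driver actually granted.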
answered Nov 12 at 18:05 – Ripi2