Use the correct pixel formats for OpenGL on big endian #11889

Status: Open. Wants to merge 1 commit into base branch main.
24 changes: 13 additions & 11 deletions src/render/opengl/SDL_render_gl.c
@@ -406,14 +406,14 @@ static bool GL_SupportsBlendMode(SDL_Renderer *renderer, SDL_BlendMode blendMode
static bool convert_format(Uint32 pixel_format, GLint *internalFormat, GLenum *format, GLenum *type)
{
switch (pixel_format) {
-    case SDL_PIXELFORMAT_ARGB8888:
-    case SDL_PIXELFORMAT_XRGB8888:
+    case SDL_PIXELFORMAT_BGRA32:
+    case SDL_PIXELFORMAT_BGRX32:
*internalFormat = GL_RGBA8;
*format = GL_BGRA;
*type = GL_UNSIGNED_BYTE; // previously GL_UNSIGNED_INT_8_8_8_8_REV, seeing if this is better in modern times.
break;
-    case SDL_PIXELFORMAT_ABGR8888:
-    case SDL_PIXELFORMAT_XBGR8888:
+    case SDL_PIXELFORMAT_RGBA32:
+    case SDL_PIXELFORMAT_RGBX32:
*internalFormat = GL_RGBA8;
*format = GL_RGBA;
*type = GL_UNSIGNED_BYTE; // previously GL_UNSIGNED_INT_8_8_8_8_REV, seeing if this is better in modern times.
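
Background note (not part of the diff): the *32 names are byte-order-relative aliases, which is the crux of this change. A minimal standalone sketch of the relationship, assuming the SDL3 headers:

#include <SDL3/SDL.h>

int main(void)
{
    /* Sketch only, not SDL code: the *32 formats always describe the same
     * byte order in memory, while the 8888 formats describe a packed 32-bit
     * value in native endianness (see SDL_pixels.h). */
#if SDL_BYTEORDER == SDL_BIG_ENDIAN
    SDL_assert(SDL_PIXELFORMAT_RGBA32 == SDL_PIXELFORMAT_RGBA8888);
    SDL_assert(SDL_PIXELFORMAT_BGRA32 == SDL_PIXELFORMAT_BGRA8888);
#else
    SDL_assert(SDL_PIXELFORMAT_RGBA32 == SDL_PIXELFORMAT_ABGR8888);
    SDL_assert(SDL_PIXELFORMAT_BGRA32 == SDL_PIXELFORMAT_ARGB8888);
#endif
    /* GL_RGBA/GL_BGRA with GL_UNSIGNED_BYTE reads components byte by byte in
     * memory order, so it matches the *32 aliases on either endianness; the
     * packed 8888 names only lined up on little-endian machines. */
    return 0;
}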
@@ -558,7 +558,8 @@ static bool GL_CreateTexture(SDL_Renderer *renderer, SDL_Texture *texture, SDL_P
renderdata->glTexParameteri(textype, GL_TEXTURE_STORAGE_HINT_APPLE,
GL_STORAGE_CACHED_APPLE);
}
-    if (texture->access == SDL_TEXTUREACCESS_STREAMING && texture->format == SDL_PIXELFORMAT_ARGB8888 && (texture->w % 8) == 0) {
+    if (!SDL_ISPIXELFORMAT_FOURCC(texture->format) && SDL_PIXELLAYOUT(texture->format) == SDL_PACKEDLAYOUT_8888 &&
+        texture->access == SDL_TEXTUREACCESS_STREAMING && (texture->w % 8) == 0) {
renderdata->glPixelStorei(GL_UNPACK_CLIENT_STORAGE_APPLE, GL_TRUE);
Collaborator:

I'm pretty sure the pixel format has to be SDL_PIXELFORMAT_ARGB8888 in order to use this extension.

Contributor (author):

Is this requirement documented somewhere? And in a similar vein, does this also require GL_UNSIGNED_INT_8_8_8_8_REV to be used as the texture type?

Collaborator:

This is the documentation:
https://registry.khronos.org/OpenGL/extensions/APPLE/APPLE_client_storage.txt
But it doesn't mention any of these requirements, so maybe it's okay as-is?

Contributor (author):

The official Apple documentation mentions this:

The best format and data type combinations to use for texture data are:

GL_BGRA, GL_UNSIGNED_INT_8_8_8_8_REV
GL_BGRA, GL_UNSIGNED_SHORT_1_5_5_5_REV
GL_YCBCR_422_APPLE, GL_UNSIGNED_SHORT_8_8_REV_APPLE
The combination GL_RGBA and GL_UNSIGNED_BYTE needs to be swizzled by many cards when the data is loaded, so it's not recommended.

It's unclear if it's strictly necessary to use those formats in order to use GL_UNPACK_CLIENT_STORAGE_APPLE, although it would make sense if the benefit of using it is negated when swizzling is required.
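
A hypothetical illustration of what using the extension with Apple's recommended pair looks like (placeholder names tex, w, h, pixels; not code from this PR):

glPixelStorei(GL_UNPACK_CLIENT_STORAGE_APPLE, GL_TRUE);
glBindTexture(GL_TEXTURE_2D, tex);
/* Apple's recommended combination: GL_BGRA + GL_UNSIGNED_INT_8_8_8_8_REV
 * reads a packed native 32-bit value (0xAARRGGBB), i.e. the ARGB8888 layout,
 * so no swizzle is needed on upload. */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, w, h, 0,
             GL_BGRA, GL_UNSIGNED_INT_8_8_8_8_REV, pixels);
/* With client storage the GL may keep reading from "pixels", so the
 * application must keep that buffer valid while the texture exists. */
glPixelStorei(GL_UNPACK_CLIENT_STORAGE_APPLE, GL_FALSE);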

Collaborator:

Makes sense. Do you want to update the PR to reflect this?

Collaborator:

> The combination GL_RGBA and GL_UNSIGNED_BYTE needs to be swizzled by many cards when the data is loaded, so it's not recommended.

This was certainly true in the early 2000's when this extension was written. :)

Collaborator:

> Makes sense. Do you want to update the PR to reflect this?

We're wrapping up the 3.2.0 release, @ccawley2011, so if you want to make updates to this PR, this is the time to do it (or tell us the updates aren't necessary and we'll merge as-is, if that's the right thing to do).

Contributor (author):

I think this should be OK as-is for now. I'll open a separate issue regarding optimal formats on Mac OS X.

Collaborator:

I think we should drop this part of the PR, and create a separate one for this. I'm not convinced this is the right logic, and changing when the Apple extension is used is conceptually a different change than the rest of this PR.

renderdata->glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
renderdata->glPixelStorei(GL_UNPACK_ROW_LENGTH,
@@ -635,7 +636,7 @@ static bool GL_CreateTexture(SDL_Renderer *renderer, SDL_Texture *texture, SDL_P
}
#endif

-    if (texture->format == SDL_PIXELFORMAT_ABGR8888 || texture->format == SDL_PIXELFORMAT_ARGB8888) {
+    if (texture->format == SDL_PIXELFORMAT_RGBA32 || texture->format == SDL_PIXELFORMAT_BGRA32) {
data->shader = SHADER_RGBA;
} else {
data->shader = SHADER_RGB;
@@ -1462,7 +1463,7 @@ static bool GL_RunCommandQueue(SDL_Renderer *renderer, SDL_RenderCommand *cmd, v
static SDL_Surface *GL_RenderReadPixels(SDL_Renderer *renderer, const SDL_Rect *rect)
{
GL_RenderData *data = (GL_RenderData *)renderer->internal;
-    SDL_PixelFormat format = renderer->target ? renderer->target->format : SDL_PIXELFORMAT_ARGB8888;
+    SDL_PixelFormat format = renderer->target ? renderer->target->format : SDL_PIXELFORMAT_RGBA32;
GLint internalFormat;
GLenum targetFormat, type;
SDL_Surface *surface;
@@ -1675,10 +1676,11 @@ static bool GL_CreateRenderer(SDL_Renderer *renderer, SDL_Window *window, SDL_Pr
renderer->window = window;

renderer->name = GL_RenderDriver.name;
-    SDL_AddSupportedTextureFormat(renderer, SDL_PIXELFORMAT_ARGB8888);
-    SDL_AddSupportedTextureFormat(renderer, SDL_PIXELFORMAT_ABGR8888);
-    SDL_AddSupportedTextureFormat(renderer, SDL_PIXELFORMAT_XRGB8888);
-    SDL_AddSupportedTextureFormat(renderer, SDL_PIXELFORMAT_XBGR8888);
+    SDL_AddSupportedTextureFormat(renderer, SDL_PIXELFORMAT_RGBA32);
+    /* TODO: Check for required extensions? */
Collaborator:
Does something still need to be done here?

+    SDL_AddSupportedTextureFormat(renderer, SDL_PIXELFORMAT_BGRA32);
+    SDL_AddSupportedTextureFormat(renderer, SDL_PIXELFORMAT_RGBX32);
+    SDL_AddSupportedTextureFormat(renderer, SDL_PIXELFORMAT_BGRX32);

data->context = SDL_GL_CreateContext(window);
if (!data->context) {
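
To close with a hypothetical usage sketch (application-side code, assuming SDL3 with this PR applied and a valid renderer):

SDL_Texture *tex = SDL_CreateTexture(renderer, SDL_PIXELFORMAT_RGBA32,
                                     SDL_TEXTUREACCESS_STREAMING, 256, 256);
if (!tex) {
    SDL_Log("SDL_CreateTexture failed: %s", SDL_GetError());
}
/* With this change, SDL_PIXELFORMAT_RGBA32 maps to GL_RGBA + GL_UNSIGNED_BYTE,
 * which describes bytes in memory order and therefore behaves the same on
 * big- and little-endian targets. */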