rsyslog should re-create the file (with correct permissions) after restarting:
# mv /var/log/syslog /tmp/
# /etc/init.d/rsyslog restart
[ ok ] Restarting rsyslog (via systemctl): rsyslog.service.
# ls -l /var/log/syslog
-rw-r----- 1 root adm 327 Oct 27 13:28 /var/log/syslog
You can also force a log entry to make sure it's running:
# /usr/bin/logger -p0 foo
# tail /var/log/syslog
...
Oct 27 13:31:39 myserver root: foo
I was using MSAA in my project, and found that the problem disappeared when I disabled it. This led me to this other question where the same problem is discussed (but not solved).
The problem seems to be that if multisampling is enabled for your main framebuffer, all of your custom FBOs have to use multisampling as well. You cannot render to a normal non-multisampled GL_TEXTURE_2D, and a multisampled GL_TEXTURE_2D_MULTISAMPLE is not available on OpenGL ES 2.
To fix the problem, I modified my render-to-texture code the same way I had modified my main rendering code to enable multisampling. In addition to the three buffer objects created in the code from the question, I create three more for the multisampled rendering:
// Multisampled FBO with a 4x multisampled color renderbuffer
glGenFramebuffersOES(1, &wmBuffer);
glGenRenderbuffersOES(1, &wmColor);
glBindFramebufferOES(GL_FRAMEBUFFER_OES, wmBuffer);
glBindRenderbufferOES(GL_RENDERBUFFER_OES, wmColor);
glRenderbufferStorageMultisampleAPPLE(GL_RENDERBUFFER_OES, 4, GL_RGBA8_OES, width, height);
glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES, GL_RENDERBUFFER_OES, wmColor);

// Matching 4x multisampled depth renderbuffer
glGenRenderbuffersOES(1, &wmDepth);
glBindRenderbufferOES(GL_RENDERBUFFER_OES, wmDepth);
glRenderbufferStorageMultisampleAPPLE(GL_RENDERBUFFER_OES, 4, GL_DEPTH_COMPONENT16_OES, width, height);
glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_DEPTH_ATTACHMENT_OES, GL_RENDERBUFFER_OES, wmDepth);
Before rendering to the texture, I bind the new MSAA buffer:
glBindFramebufferOES(GL_FRAMEBUFFER_OES, wmBuffer);
Finally, after rendering, I resolve the MSAA FBO into the texture FBO the same way I do for my main rendering framebuffer:
// Resolve the multisampled FBO into the texture-backed FBO
glBindFramebufferOES(GL_READ_FRAMEBUFFER_APPLE, wmBuffer);
glBindFramebufferOES(GL_DRAW_FRAMEBUFFER_APPLE, wBuffer);
glResolveMultisampleFramebufferAPPLE();

// Hint to the driver that the multisampled contents can be discarded
GLenum attachments[] = {GL_DEPTH_ATTACHMENT_OES, GL_COLOR_ATTACHMENT0_OES, GL_STENCIL_ATTACHMENT_OES};
glDiscardFramebufferEXT(GL_READ_FRAMEBUFFER_APPLE, 3, attachments);
glBindFramebufferOES(GL_FRAMEBUFFER_OES, 0);
The textures are now rendered correctly (and the performance is great!).
Best Answer
This is how I'm doing it.
I define a texture variable (I use Apple's Texture2D class, but you can use a raw OpenGL texture id if you prefer) and a framebuffer. Then, at some point, I create the texture and the framebuffer and attach the renderbuffer; this only needs to be done once:
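(The original answer's setup code did not survive extraction. Below is a minimal sketch of one-time render-to-texture setup in plain OpenGL ES 1.x style; the names `textureId`, `textureFBO`, `width`, and `height` are my assumptions, not from the original answer.)

```c
// One-time setup (hypothetical names: textureId, textureFBO, width, height).
GLuint textureId, textureFBO;

// Create the destination texture with empty storage.
glGenTextures(1, &textureId);
glBindTexture(GL_TEXTURE_2D, textureId);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, NULL);

// Create the framebuffer and attach the texture as its color buffer.
glGenFramebuffersOES(1, &textureFBO);
glBindFramebufferOES(GL_FRAMEBUFFER_OES, textureFBO);
glFramebufferTexture2DOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES,
                          GL_TEXTURE_2D, textureId, 0);
```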
Every time I want to render to the texture, I do:
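(The per-frame code is also missing from the extracted answer. A plausible sketch, again assuming the hypothetical `textureFBO`, `width`, and `height` from the setup above:)

```c
// Each frame: redirect rendering into the texture's framebuffer.
glBindFramebufferOES(GL_FRAMEBUFFER_OES, textureFBO);
glViewport(0, 0, width, height);

// ... draw the scene as usual ...

// Restore the default on-screen framebuffer when done.
glBindFramebufferOES(GL_FRAMEBUFFER_OES, 0);
```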
As for your question 3: that's it. You can then use the texture as if it were any other texture.
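For example, drawing with the rendered-to texture is just ordinary texturing (a sketch, using the hypothetical `textureId` from the setup above):

```c
// Bind the rendered-to texture like any other texture...
glBindTexture(GL_TEXTURE_2D, textureId);
// ...then draw a textured quad (or pass it to Texture2D's draw methods).
```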