How do you back up your development machine? [closed]

There’s an important distinction between backing up your development machine and backing up your work. For a development machine, your best bet is an imaging solution that offers as near a “one-click restore” process as possible. Time Machine (Mac) and Windows Home Server (Windows) are both excellent for this purpose. Not only can you have your entire … Read more

What is actually a Queue family in Vulkan?

To understand queue families, you first have to understand queues. A queue is something you submit command buffers to, and command buffers submitted to a queue are executed in order[*1] relative to each other. Command buffers submitted to different queues are unordered relative to each other unless you explicitly synchronize them with a VkSemaphore. You can … Read more
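
Here is a minimal C++ sketch of what that looks like in practice: it enumerates the queue families a physical device exposes and picks one whose queues accept graphics command buffers. It assumes you already hold a valid VkPhysicalDevice; the function name findGraphicsQueueFamily is just for illustration.

```cpp
#include <vulkan/vulkan.h>
#include <cstdint>
#include <vector>

// Sketch: find the index of a queue family that accepts graphics work.
// Assumes `physicalDevice` is a valid handle from vkEnumeratePhysicalDevices.
uint32_t findGraphicsQueueFamily(VkPhysicalDevice physicalDevice) {
    uint32_t count = 0;
    vkGetPhysicalDeviceQueueFamilyProperties(physicalDevice, &count, nullptr);

    std::vector<VkQueueFamilyProperties> families(count);
    vkGetPhysicalDeviceQueueFamilyProperties(physicalDevice, &count, families.data());

    for (uint32_t i = 0; i < count; ++i) {
        // Each family advertises which kinds of work its queues accept
        // (graphics, compute, transfer, ...) and how many queues it exposes.
        if (families[i].queueFlags & VK_QUEUE_GRAPHICS_BIT) {
            return i;
        }
    }
    return UINT32_MAX;  // no graphics-capable family on this device
}
```

When you create a logical device you would then request at least one queue from that family, and vkGetDeviceQueue hands back the VkQueue you actually submit command buffers to.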

To what extent is it acceptable to think of C++ pointers as memory addresses?

You should think of pointers as addresses in virtual memory: modern consumer operating systems and runtime environments place at least one layer of abstraction between physical memory and what you see as a pointer value. As for your final statement, you cannot make that assumption, even within a single virtual address space. Pointer arithmetic … Read more
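
A short sketch of the distinction, assuming a typical desktop C++ implementation: arithmetic within a single array is well defined, while the addresses of unrelated objects have no relationship you can rely on.

```cpp
#include <cstdio>

int main() {
    int buffer[4] = {10, 20, 30, 40};

    // Well-defined: pointer arithmetic within the same array. Adding 2 moves
    // the pointer by 2 * sizeof(int) bytes, i.e. two elements forward.
    int* p = &buffer[0];
    int* q = p + 2;
    std::printf("q - p = %td, *q = %d\n", q - p, *q);  // prints 2 and 30

    // Not something you can assume: that two separate objects sit next to
    // each other, or that their addresses compare in any particular order.
    int a = 1;
    int b = 2;
    std::printf("&a = %p, &b = %p\n",
                static_cast<void*>(&a), static_cast<void*>(&b));
    return 0;
}
```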

How does the CPU communicate with peripherals?

In older architectures, peripherals were accessed via a mechanism separate from memory access, using special I/O instructions. On x86, there were (and still are!) “in” and “out” instructions for transferring bytes between the CPU and a peripheral. Peripherals were given port addresses, for example 0x60 for the keyboard controller. Simplifying a lot, doing “in 0x60” would read … Read more
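
As an illustration only (not how real drivers are written), the sketch below shows port-mapped I/O from user space on Linux/x86 with glibc: ioperm() grants access to a port and an inline “in” instruction reads a byte from it. Port 0x60, the legacy keyboard controller data port, and the helper name inb_port are assumptions made for the example.

```cpp
#include <cstdio>
#include <sys/io.h>  // ioperm(); glibc, x86/x86_64 Linux only

// Read one byte from an I/O port with the "in" instruction.
static unsigned char inb_port(unsigned short port) {
    unsigned char value;
    asm volatile("inb %1, %0" : "=a"(value) : "Nd"(port));
    return value;
}

int main() {
    const unsigned short kbdDataPort = 0x60;  // keyboard controller data port
    if (ioperm(kbdDataPort, 1, 1) != 0) {     // needs root privileges
        std::perror("ioperm");
        return 1;
    }
    unsigned char byte = inb_port(kbdDataPort);
    std::printf("read 0x%02x from port 0x%02x\n", byte, kbdDataPort);
    return 0;
}
```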

Is bit shifting O(1) or O(n)?

Some instruction sets are limited to shifting by one bit per instruction. Others allow you to specify any number of bits to shift in a single instruction, which usually takes one clock cycle on modern processors (modern being an intentionally vague word). See dan04’s answer about a barrel shifter, a circuit that shifts more … Read more
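
The difference is easier to see side by side. This plain C++ sketch (names are illustrative) contrasts what an ISA limited to one-bit shifts effectively does, a loop that is O(n) in the shift amount, with a single variable-distance shift that a barrel shifter completes in one step.

```cpp
#include <cstdint>
#include <cstdio>

// What an ISA restricted to one-bit shifts has to do: n iterations.
uint32_t shift_left_one_bit_at_a_time(uint32_t value, unsigned n) {
    for (unsigned i = 0; i < n; ++i) {
        value <<= 1;
    }
    return value;
}

// With a barrel shifter this is a single shift instruction, regardless of n
// (for n < 32; larger shifts are undefined behavior in C++).
uint32_t shift_left_barrel(uint32_t value, unsigned n) {
    return value << n;
}

int main() {
    std::printf("%u\n", shift_left_one_bit_at_a_time(3, 5));  // 96
    std::printf("%u\n", shift_left_barrel(3, 5));             // 96
    return 0;
}
```

Either way the C++ expression is the same; the question is only how many cycles the hardware spends executing it.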

Why do we use CPUs for ray tracing instead of GPUs?

I’m one of the rendering software architects at a large VFX and animated feature studio with a proprietary renderer (not Pixar, though I was once the rendering software architect there as well, long, long ago). Almost all high-quality rendering for film (at all the big studios, with all the major renderers) is CPU only. There … Read more