Event cameras provide asynchronous, data-driven measurements of pixel-wise temporal contrast with extremely high temporal resolution over a large dynamic range. Events represent discrete pixel-wise brightness changes, and they are transmitted almost instantaneously. A key advantage over conventional cameras is that event cameras transmit only non-redundant information, rather than synchronously updating the full image frame at a fixed rate. This enables efficient local processing of salient image regions. I propose to estimate a continuous-time image state that is updated asynchronously as events arrive. The image gradient, which is important for motion perception, feature extraction, and edge detection, can also be estimated from events. Finally, I propose to estimate a continuous-time motion state that describes 2D optical flow, based on the estimated image gradient and the precise timing of events.
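The pipeline above can be illustrated with a minimal sketch. The event tuple `(x, y, t, polarity)`, the contrast threshold `C`, the central-difference gradient, and the normal-flow formula `v = -(dL/dt) * g / |g|^2` (from the brightness-constancy constraint, with `dL/dt` recovered from event polarity and timing) are all illustrative assumptions here, not a definitive implementation of the proposed method:

```python
import numpy as np

C = 0.2      # assumed contrast threshold: each event signals a log-intensity step of +/- C
H, W = 4, 4  # tiny sensor for illustration

# Continuous-time image state: per-pixel log intensity plus last-update timestamps.
log_I = np.zeros((H, W))
last_t = np.zeros((H, W))

def update_state(event):
    """Asynchronously integrate one event (x, y, t, polarity) into the image state."""
    x, y, t, p = event
    log_I[y, x] += C * p   # apply the signed brightness step at this pixel
    last_t[y, x] = t

def spatial_gradient(img, y, x):
    """Central-difference estimate of the image gradient at (y, x), clamped at borders."""
    gx = (img[y, min(x + 1, W - 1)] - img[y, max(x - 1, 0)]) / 2.0
    gy = (img[min(y + 1, H - 1), x] - img[max(y - 1, 0), x]) / 2.0
    return np.array([gx, gy])

def normal_flow(event):
    """Normal-flow estimate from brightness constancy, grad(L) . v = -dL/dt,
    using the event's polarity and timing to approximate dL/dt."""
    x, y, t, p = event
    dt = t - last_t[y, x]          # time since the state at this pixel last changed
    g = spatial_gradient(log_I, y, x)
    g2 = g @ g
    if dt <= 0 or g2 == 0:
        return np.zeros(2)         # flow is unobservable without gradient or elapsed time
    dLdt = C * p / dt
    return -dLdt * g / g2          # flow component along the gradient direction

# Feed a short synthetic stream: a brightening edge sweeping right along row 1
# at one pixel per 0.1 s, i.e. a true horizontal speed of 10 px/s.
events = [(0, 1, 0.00, 1), (1, 1, 0.10, 1), (2, 1, 0.20, 1)]
flows = []
for e in events:
    flows.append(normal_flow(e))   # estimate flow from the state before this event
    update_state(e)                # then update the image state asynchronously
v_last = flows[-1]
```

For the final event the estimate recovers the edge speed along the gradient direction; flow perpendicular to the local edge remains unobservable (the aperture problem), which is why only the normal component is estimated from a single event.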