Grasping movements are known to activate fronto-parietal brain networks in both human and non-human primates. However, it remains unclear whether these activations represent object properties, hand postures, or both at different stages of the movement. We manipulated the intrinsic properties of the objects and the grasp types to create twelve unique grasping movements, and we investigated, in healthy adult humans, the low-frequency time-domain EEG representation of grasping across the stages of the movement. Next, we implemented two multiclass decoders, one for grasp type and one for object properties, and evaluated them over time. Furthermore, we assessed the similarity between these EEG grasping representations and categorical models encoding either properties of the movement or intrinsic properties of the objects. We found that properties of the grasping movement (grasp type, number of fingers) and intrinsic object properties (shape and size), as represented in EEG, are encoded in different brain areas throughout the movement stages. Both object properties and grasp types could be decoded significantly above chance level from low-frequency EEG activity during movement planning and execution. Moreover, this preferential time-wise encoding allowed object properties to be decoded already at the observation stage, while grasp type could also be accurately decoded at the object-release stage. These findings advance our understanding of how grasping is represented in noninvasive EEG signals and how this representation evolves over the course of the movement, in relation to categorical models describing either the grasped object's properties or the properties of the grasping movement. Moreover, our multiclass grasping decoders are informative for the design and implementation of noninvasive motor control strategies.
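The time-resolved multiclass decoding described above can be sketched as follows. This is a minimal illustration on synthetic data, not the paper's pipeline: the classifier (linear discriminant analysis), the sliding-window parameters, the data dimensions, and the injected class signal are all assumptions introduced here for demonstration. Accuracy in each window is compared against the theoretical chance level of 1/(number of classes).

```python
# Hypothetical sketch of time-resolved multiclass decoding (e.g. of grasp
# type) from epoched low-frequency EEG. All shapes, the LDA classifier,
# and the synthetic signal are illustrative assumptions.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import StratifiedKFold, cross_val_score

rng = np.random.default_rng(0)
n_trials, n_channels, n_samples = 120, 32, 100  # assumed epoch dimensions
n_classes = 4                                   # e.g. four grasp types
y = np.repeat(np.arange(n_classes), n_trials // n_classes)

# Synthetic EEG epochs: Gaussian noise plus a class-dependent offset
# on one channel, standing in for a class-discriminative component.
X = rng.standard_normal((n_trials, n_channels, n_samples))
for c in range(n_classes):
    X[y == c, c, :] += 1.5  # injected class signal (assumption)

def decode_over_time(X, y, win=10, step=10):
    """Cross-validated decoding accuracy in sliding windows over the epoch."""
    cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
    scores = []
    for start in range(0, X.shape[2] - win + 1, step):
        # Flatten channels x time within the window into a feature vector.
        feats = X[:, :, start:start + win].reshape(len(y), -1)
        clf = LinearDiscriminantAnalysis()
        scores.append(cross_val_score(clf, feats, y, cv=cv).mean())
    return np.array(scores)

acc = decode_over_time(X, y)
chance = 1.0 / n_classes
print("accuracy per window:", acc.round(2), "| chance =", chance)
```

In a real analysis, the same decoder would be run separately per movement stage (observation, planning, execution, release), and significance against chance would be established with a permutation test rather than the raw comparison shown here.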