Simultaneous recordings from N electrodes generate N-dimensional time series that call for efficient representations to expose relevant aspects of the underlying dynamics. Binning the time series defines a sequence of neural activity vectors that populate the N-dimensional space as a density distribution; this distribution is especially informative when the neural dynamics proceeds as a noisy path through metastable states (often a case of interest in neuroscience), which makes clustering in the N-dimensional space a natural choice. We apply a variant of the 'mean-shift' algorithm to perform such clustering, and validate it on a Hopfield network in the glassy phase, in which metastable states are largely uncorrelated with the memory attractors. The neural states identified as the clusters' centroids are then used to define a parsimonious parametrization of the synaptic matrix, which allows a significant improvement in inferring the synaptic couplings from the neural activities. We next consider the more realistic case of a multi-modular spiking network, with spike-frequency adaptation inducing history-dependent effects; we develop a procedure, inspired by Boltzmann learning but extending its domain of application, to learn inter-module synaptic couplings such that the spiking network reproduces a prescribed pattern of spatial correlations. After clustering the activity generated by such multi-modular spiking networks, we cast their multi-dimensional dynamics in the form of a symbolic sequence of the clusters' centroids; this representation naturally lends itself to complexity estimates that provide compact information on memory effects such as those induced by spike-frequency adaptation.
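As a minimal sketch of the clustering step, the following flat-kernel mean-shift is a generic variant (the exact variant used in the work is not specified here; function names and the flat kernel are illustrative assumptions). Each binned activity vector is iteratively shifted toward the mean of its neighborhood; vectors converging to the same mode form one cluster, and the modes serve as the centroids:

```python
import numpy as np

def mean_shift(points, bandwidth, n_iter=50):
    """Shift each point toward its local density maximum with a flat kernel.

    points: (n, d) array of binned activity vectors; bandwidth: kernel radius.
    Illustrative sketch, not the paper's exact mean-shift variant.
    """
    shifted = points.copy()
    for _ in range(n_iter):
        for i in range(len(shifted)):
            # flat kernel: average over all data points within the bandwidth
            dists = np.linalg.norm(points - shifted[i], axis=1)
            neighbors = points[dists < bandwidth]
            if len(neighbors) > 0:
                shifted[i] = neighbors.mean(axis=0)
    return shifted

def extract_centroids(shifted, tol=1e-3):
    """Merge converged points lying within tol of an existing centroid."""
    centroids = []
    for p in shifted:
        if not any(np.linalg.norm(p - c) < tol for c in centroids):
            centroids.append(p)
    return np.array(centroids)
```

Unlike k-means, this requires no prior choice of the number of clusters; the bandwidth sets the spatial scale at which density modes are resolved.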
Specifically, to obtain a relative complexity measure we compare the Lempel-Ziv complexity of the actual centroid sequence with that of Markov processes sharing the same transition probabilities between centroids; as an illustration, we show how such relative complexity depends on the characteristic time scale of spike-frequency adaptation.
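A minimal sketch of this comparison, assuming an LZ76-style phrase count and a first-order Markov surrogate (function names, the surrogate construction, and the normalization are illustrative assumptions, not the paper's exact procedure):

```python
import numpy as np

def lz_complexity(seq):
    """LZ76-style complexity: number of phrases in an exhaustive parsing,
    where each new phrase is the shortest block not previously seen."""
    s = ''.join(map(str, seq))
    n, i, c = len(s), 0, 0
    while i < n:
        l = 1
        # extend the phrase while it already occurs in the preceding history
        while i + l <= n and s[i:i + l] in s[:i + l - 1]:
            l += 1
        c += 1
        i += min(l, n - i)
    return c

def markov_surrogate(seq, rng):
    """Sequence of the same length following the empirical first-order
    transition probabilities of seq (assumes every symbol in seq has at
    least one outgoing transition)."""
    trans = {}
    for a, b in zip(seq, seq[1:]):
        trans.setdefault(a, []).append(b)
    out = [seq[0]]
    for _ in range(len(seq) - 1):
        out.append(rng.choice(trans[out[-1]]))
    return out

def relative_complexity(seq, n_surrogates=20, seed=0):
    """LZ complexity of seq divided by the mean LZ complexity of its Markov
    surrogates; values below 1 indicate structure beyond first-order memory."""
    rng = np.random.default_rng(seed)
    surr = [lz_complexity(markov_surrogate(seq, rng))
            for _ in range(n_surrogates)]
    return lz_complexity(seq) / np.mean(surr)
```

Normalizing by the Markov surrogates factors out the complexity attributable to the transition probabilities alone, so the ratio isolates longer-range memory effects in the centroid sequence.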