Abstract:
Tables sketched by users on touchscreen devices such as mobile phones and smart whiteboards increasingly demand automatic analysis and understanding of their structural types. Because existing pen-based interaction research has focused primarily on layout analysis and line recognition, cell type recognition for hand-drawn tables remains an open challenge. To address it, this paper proposes a hand-drawn table cell type recognition algorithm based on graph neural networks. First, a spatiotemporal graph attention network classifies sketch strokes into two categories, thereby identifying cell stroke blocks. The spatiotemporal relationships among these stroke blocks are then modeled: by assigning differentiated attention weights to different nodes, the algorithm captures the temporal and spatial correlations within the table structure, improving the accuracy of cell type recognition. Experimental results on both a user hand-drawn dataset and the public IAMonDo dataset show that, compared with baseline algorithms, the proposed algorithm improves overall recognition accuracy by 14.20 percentage points on the user hand-drawn dataset and by 13.08 percentage points on the IAMonDo dataset, demonstrating its effectiveness. In addition, a sketch table dataset with character-level fine-grained annotations is constructed, providing data support for the recognition and understanding of sketch-based tables.
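The attention-weighted aggregation over stroke-block nodes that the abstract describes can be sketched as follows. This is a minimal, illustrative NumPy implementation of a generic graph-attention layer; the feature dimensions, adjacency pattern, and parameter shapes are hypothetical assumptions for demonstration, not the paper's actual architecture.

```python
import numpy as np

def leaky_relu(x, slope=0.2):
    return np.where(x > 0, x, slope * x)

def gat_layer(H, A, W, a):
    """One graph-attention layer over stroke-block nodes (illustrative).

    H: (N, F)  node features (e.g. spatial/temporal descriptors of blocks)
    A: (N, N)  adjacency, 1 where two blocks are spatiotemporal neighbours
               (self-loops included so every row has at least one neighbour)
    W: (F, F') shared linear transform
    a: (2*F',) attention parameter vector
    Returns updated features (N, F') and attention weights (N, N).
    """
    Z = H @ W
    f = Z.shape[1]
    # e_ij = LeakyReLU(a^T [z_i || z_j]), computed via broadcasting
    e = leaky_relu((Z @ a[:f])[:, None] + (Z @ a[f:])[None, :])
    e = np.where(A > 0, e, -np.inf)            # restrict to graph edges
    e -= e.max(axis=1, keepdims=True)          # numerical stability
    alpha = np.exp(e)
    alpha /= alpha.sum(axis=1, keepdims=True)  # softmax over neighbours
    return alpha @ Z, alpha

# Toy example: 4 stroke-block nodes with 3-dim features,
# connected in a chain of spatiotemporal neighbours.
rng = np.random.default_rng(0)
H = rng.standard_normal((4, 3))
A = np.array([[1, 1, 0, 0],
              [1, 1, 1, 0],
              [0, 1, 1, 1],
              [0, 0, 1, 1]])
W = rng.standard_normal((3, 4))
a = rng.standard_normal(8)
H_out, alpha = gat_layer(H, A, W, a)
```

Each node's updated feature is a convex combination of its neighbours' transformed features, with the differentiated weights `alpha` learned per edge; in the paper's setting these weights would reflect how strongly neighbouring stroke blocks inform a cell's type.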