DigitalMe: A Large-Scale Real-World 3D Garment Dataset

DigitalMe garment examples

Overview

DigitalMe is a large-scale real-world 3D garment dataset introduced for realistic garment modelling, reconstruction, and generation. The dataset consists of high-resolution 3D garment meshes extracted from clothed human scans and is used to train Design2Cloth, a 3D garment generative model that produces detailed garments from simple 2D visibility masks.

DigitalMe is designed to address the limitations of synthetic or small-scale garment datasets by providing real-world clothing geometry with realistic wrinkles, folds, and fabric deformations.

Key Features

Metadata

Each released garment sample is accompanied by the corresponding SMPL parameter file saved in .npy format. These parameters provide the associated body shape information used for garment alignment and canonicalisation.

The current release focuses on 3D garment geometry and associated body shape parameters. 2D masks, rendered-view annotations, and texture maps are not included in this release.

Data Structure

The dataset is organised in a simple flat structure, where each garment mesh is associated with a corresponding SMPL parameter file. The SMPL parameters are stored separately and share the same file naming convention as the mesh.

mainfolder/
  mesh1.obj
  mesh2.obj
  ...
  meshn.obj

  smpl_params/
    mesh1.npy
    mesh2.npy
    ...
    meshn.npy

Each .obj file represents a 3D garment mesh, while the corresponding .npy file contains the SMPL parameters associated with that garment. Files are matched by filename, for example mesh1.obj corresponds to mesh1.npy.

Recommended Evaluation Protocol

The current dataset release does not include an official train/test split. For reproducible comparison, users are encouraged to define and report a fixed split when using DigitalMe for garment reconstruction or generation.
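One simple way to define such a fixed split is to sort the mesh filenames and shuffle them with a fixed seed. The sketch below is illustrative only: `make_split`, the 80/20 ratio, and the seed are arbitrary choices, not an official protocol; report whichever values you use.

```python
import os
import random


def make_split(mesh_dir, train_frac=0.8, seed=0):
    """Create a reproducible train/test split over the .obj files.

    Sorting before shuffling makes the split independent of the
    filesystem's listing order; the same seed always yields the
    same split.
    """
    names = sorted(f[:-4] for f in os.listdir(mesh_dir) if f.endswith(".obj"))
    rng = random.Random(seed)
    rng.shuffle(names)
    n_train = int(len(names) * train_frac)
    return names[:n_train], names[n_train:]
```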

Metrics

The example code below reports two standard surface metrics:

  1. Chamfer Distance: the sum of the mean nearest-neighbour distance from predicted points to ground-truth points and the mean distance in the reverse direction (lower is better).
  2. Normal Consistency: the mean absolute cosine similarity between each predicted normal and the normal of its nearest ground-truth point (higher is better).

Evaluation Setup

Both metrics are computed on point clouds sampled uniformly from the predicted and ground-truth mesh surfaces (10,000 points per mesh by default in the example below).

Example Evaluation Code

import trimesh
import numpy as np
from scipy.spatial import cKDTree


def sample_mesh(mesh_path, n_points=10000):
    """Uniformly sample points on the mesh surface and return them
    together with the normals of the faces they were drawn from."""
    mesh = trimesh.load(mesh_path, process=False)
    points, face_idx = trimesh.sample.sample_surface(mesh, n_points)
    normals = mesh.face_normals[face_idx]
    return points, normals


def chamfer_distance(points_pred, points_gt):
    """Symmetric Chamfer distance: mean nearest-neighbour distance
    from prediction to ground truth plus the reverse direction."""
    tree_gt = cKDTree(points_gt)
    tree_pred = cKDTree(points_pred)

    dist_pred_to_gt, _ = tree_gt.query(points_pred)
    dist_gt_to_pred, _ = tree_pred.query(points_gt)

    return dist_pred_to_gt.mean() + dist_gt_to_pred.mean()


def normal_consistency(points_pred, normals_pred, points_gt, normals_gt):
    """Mean absolute cosine similarity between each predicted normal
    and the normal of its nearest ground-truth point."""
    tree_gt = cKDTree(points_gt)
    _, idx_pred_to_gt = tree_gt.query(points_pred)
    matched_normals_gt = normals_gt[idx_pred_to_gt]
    consistency = np.abs(np.sum(normals_pred * matched_normals_gt, axis=1))
    return consistency.mean()


pred_points, pred_normals = sample_mesh("predicted.obj")
gt_points, gt_normals = sample_mesh("ground_truth.obj")

cd = chamfer_distance(pred_points, gt_points)
nc = normal_consistency(pred_points, pred_normals, gt_points, gt_normals)

print("Chamfer Distance:", cd)
print("Normal Consistency:", nc)
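Note that Chamfer Distance is scale-dependent: its value changes if the meshes are expressed in different units. When predicted and ground-truth meshes are not already in a common metric scale, one common convention (a suggestion here, not part of any official protocol) is to centre each point cloud and normalise its bounding box before evaluation:

```python
import numpy as np


def normalize_points(points):
    """Centre a point cloud and scale it so that the longest side of
    its axis-aligned bounding box equals 1, making Chamfer values
    comparable across garments of different sizes. This convention
    is a suggestion, not part of the dataset's protocol."""
    center = (points.max(axis=0) + points.min(axis=0)) / 2.0
    extent = (points.max(axis=0) - points.min(axis=0)).max()
    return (points - center) / extent
```

Apply the same normalisation to both predicted and ground-truth point clouds before computing the metrics, and state in any report whether results are in normalised or metric units.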

Possible Benchmark Tasks

  1. 3D garment generation from latent representations
  2. 3D garment reconstruction from user-provided 2D masks
  3. Interpolation between garment styles and shapes

Download

The DigitalMe dataset is available for non-commercial research and educational use. To obtain access, please follow the instructions below.

Please ensure that all usage complies with the license terms described below.

License

Models and data, along with their corresponding derivatives, may be used for non-commercial research and education purposes only. You agree not to copy, sell, trade, or exploit the models or data for any commercial purpose. In any published research using the models or data, you agree to cite the following paper:

BibTeX

@InProceedings{Zheng_2024_CVPR,
    author    = {Zheng, Jiali and Potamias, Rolandos Alexandros and Zafeiriou, Stefanos},
    title     = {Design2Cloth: 3D Cloth Generation from 2D Masks},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    month     = {June},
    year      = {2024},
    pages     = {1748-1758}
}