Shap-E

This is the official code and model release for Shap-E: Generating Conditional 3D Implicit Functions.

  • See Usage for guidance on how to use this repository.
  • See Samples for examples of what our text-conditional model can generate.

Samples

Here are some highlighted samples from our text-conditional model. For random samples on selected prompts, see samples.md.

(Sample renders for the prompts: a chair that looks like an avocado; an airplane that looks like a banana; a spaceship; a birthday cupcake; a chair that looks like a tree; a green boot; a penguin; ube ice cream cone; a bowl of vegetables.)

Usage

Install with pip install -e . from the repository root.

To get started with examples, see the following notebooks:

  • sample_text_to_3d.ipynb - sample a 3D model, conditioned on a text prompt (a condensed sketch of this flow appears after this list)
  • sample_image_to_3d.ipynb - sample a 3D model, conditioned on a synthetic view image
  • encode_model.ipynb - loads a 3D model or a trimesh, creates a batch of multiview renders and a point cloud, encodes them into a latent, and renders it back. For this to work, install Blender version 3.3.1 or higher, and set the environment variable BLENDER_PATH to the path of the Blender executable.
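
For a quick sense of the API, here is a condensed sketch of the text-to-3D flow from sample_text_to_3d.ipynb: load the latent diffusion model and the transmitter, sample latents conditioned on a prompt, and render the results. The prompt and hyperparameter values below are illustrative, the notebook itself is the canonical reference, and the final GIF-saving step is a stand-in for the notebook's inline viewer.

```python
import torch

from shap_e.diffusion.sample import sample_latents
from shap_e.diffusion.gaussian_diffusion import diffusion_from_config
from shap_e.models.download import load_model, load_config
from shap_e.util.notebooks import create_pan_cameras, decode_latent_images

device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')

# 'transmitter' decodes latents into implicit functions;
# 'text300M' is the text-conditional latent diffusion model.
xm = load_model('transmitter', device=device)
model = load_model('text300M', device=device)
diffusion = diffusion_from_config(load_config('diffusion'))

batch_size = 4
guidance_scale = 15.0  # classifier-free guidance strength
prompt = "a chair that looks like an avocado"  # illustrative prompt

latents = sample_latents(
    batch_size=batch_size,
    model=model,
    diffusion=diffusion,
    guidance_scale=guidance_scale,
    model_kwargs=dict(texts=[prompt] * batch_size),
    progress=True,
    clip_denoised=True,
    use_fp16=True,
    use_karras=True,
    karras_steps=64,
    sigma_min=1e-3,
    sigma_max=160,
    s_churn=0,
)

# Render each latent from a ring of cameras ('nerf' and 'stf' are the
# two rendering modes) and save the frames as an animated GIF.
cameras = create_pan_cameras(64, device)
for i, latent in enumerate(latents):
    images = decode_latent_images(xm, latent, cameras, rendering_mode='nerf')
    images[0].save(f'sample_{i}.gif', save_all=True,
                   append_images=images[1:], duration=100, loop=0)
```

The image-conditional flow in sample_image_to_3d.ipynb is analogous: load 'image300M' instead of 'text300M' and pass images rather than texts in model_kwargs.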