TL;DR: Apple doesn't allow this yet.
I absolutely agree with you — I wish I could make it paid up front (it would save me a lot of time), but Apple wouldn't approve the app.
The situation is like this: a developer can restrict an app to devices that support certain features (for example, a developer can make an app available only for devices that support ARKit). My original intent was to restrict the app to devices with the TrueDepth camera (currently the iPhone X, XR, XS, 11, 11 Pro, 12, 12 Pro, 13, 13 Pro, 14, 14 Pro, 15 and 15 Pro & their Max / Plus / mini variants, and the iPad Pro with FaceID) and to leave out the useless camera part, but it turns out Apple won't let you do this (I've filed bug reports together with other developers and suggested that Apple let us developers restrict apps to TrueDepth-camera-only devices).
So if you submit an app that fully works only on the iPhone X and not on other devices, your app gets rejected. That's why I made the app free, with the useless camera part that's available on all devices, and included the 3D Scanner part as an IAP that can be bought only on supported devices (i.e. on devices with the TrueDepth camera).
Because the app can be downloaded on non-supported devices (again, there's no way to restrict it to supported devices), I thought it'd be silly to ask for money for an app that just takes photos on devices other than the iPhone X — i.e. I didn't want to disappoint non-iPhone X users.
Then you need to scan (i.e. move the phone) more slowly, especially at the very high Precisions such as 0.5mm, 0.8mm or 1mm. Try playing with the scanning speed to see how it influences scan quality. For example, when scanning with 0.5mm Precision, try moving your iPhone very slowly (e.g. 5mm per second) and then compare it with faster movement — you should see that the slower movement gives a better-quality scan.
There are 4 ways you can download your models and photos:
Infinite Scanning is an experimental feature that lets you scan (theoretically) infinitely large spaces — with this feature enabled, you're no longer limited to ~800MB scan sizes; the only limit is the free storage on your device!
To enable this feature, go to the iOS Settings app > Heges and turn on Infinite Scanning.
Please note that this feature requires significant computational power, so it is advised that you scan slowly and, if possible, use a newer device with Apple's A12 chip or newer (iPhone XS, XR and newer, and the iPad Pro 2018 with FaceID and newer). However, this feature works perfectly OK on older devices too (e.g. on the iPhone X).
You'll need:
How to share screen:
To disable Screen Sharing, double-tap the live video preview again and stop Screen Sharing (do it on both devices). You can also control the video quality of Screen Sharing: go to the iOS Settings app > Heges > [adjust the slider]. Screenshot here.
You can start/stop a scanning session from the device you are sharing the screen to. Example: you have an iPhone XS that you want to scan with and an iPad Air that you want to Share Screen to. After you enable Screen Sharing, you can tap the round toggle button on the iPad to start/stop the scanning session on the iPhone.
It is possible to hold a small mirror in front of the TrueDepth camera while scanning, so that you're able to see what you're scanning. 3D Pete made this mirror holder for the iPhone XS, which can be used for this purpose.
A better-suited alternative (which works for multiple iPhone/iPad models) is this iPad / iPhone TrueDepth camera adapter made by Edu Altamirano.
When you scan using a mirror, the resulting models are horizontally flipped. To correct this, enable the Mirror-scanning toggle button: since models scanned using a mirror are flipped, enabling this toggle flips the models back horizontally when they are exported to STL/PLY. You don't need to have this option enabled while scanning with the mirror — just when exporting the scanned models.
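For the curious, the correction the toggle applies on export is simply a mirror of the geometry across one axis plus a face-winding fix. Below is a minimal sketch of the same operation done outside the app, assuming the trimesh library and example file names (nothing here is Heges' own code):

```python
import numpy as np
import trimesh  # assumed third-party mesh library; any toolkit with STL/PLY I/O works

mesh = trimesh.load("scan.stl")                 # hypothetical exported, mirrored scan
vertices = np.asarray(mesh.vertices).copy()
faces = np.asarray(mesh.faces).copy()

vertices[:, 0] *= -1.0                          # mirror the geometry across the YZ plane
faces = faces[:, ::-1]                          # reverse winding so faces still point outward

unflipped = trimesh.Trimesh(vertices=vertices, faces=faces, process=False)
unflipped.export("scan_unflipped.stl")
```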
I'd recommend first exporting your scanned model into PLY (contains colors) or STL (doesn't contain colors) and then using the non-AR viewer.
Use the Internal format AR viewer only when the exported PLY model is too big to be displayed.
The Heges app's 3D model viewers are meant just for a quick preview of the model — any editing of the scanned models should be done on a PC or in another app (I recommend MeshLab and Autodesk Meshmixer).

Version 1.2 of Heges added a new underlying scanning technique (and made it the default) that lets you more easily "rescan" parts of the model that have already been scanned and that may contain geometry glitches; to "rescan" a part of the model that appears glitchy, just point your iPhone/iPad at the place you want to rescan and stay still for a moment (you should see the model readjust). Generally, try experimenting to see for yourself.
If you don't like this new way of scanning (or if it causes scanning session interruptions for you), you can switch to the old scanning technique in the iOS Settings app -> Heges -> turn off "Use New Scanning Method".
The black edges are present due to the new Color Improvement feature. If there were no Color Improvement, you would see color glitches more frequently (try for yourself how the Color Improvement level changes color quality and the size of the black edges).
To get rid of the black edges, you can turn Color Improvement off in the iOS Settings app -> Heges -> Color Improvement -> Off, or you can change its intensity to Low, Medium, High, Highest, OP or Über, with Low being the minimal Color Improvement and Über the maximal.
You can go into the "iOS Settings app -> Heges -> Exported Model Units" and then select whether models should be exported in meters, centimeters, millimeters, inches or feet.
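This setting presumably boils down to a uniform scale applied to the exported coordinates; the geometry itself doesn't change. A tiny illustration of what such a rescale amounts to, using standard unit-conversion factors (these are not Heges internals):

```python
# Assumed conversion factors from meters to each export unit.
UNIT_SCALE = {"m": 1.0, "cm": 100.0, "mm": 1000.0, "in": 39.3701, "ft": 3.28084}

def scale_vertices(vertices_m, unit="mm"):
    """Scale vertex coordinates given in meters into the chosen export unit."""
    factor = UNIT_SCALE[unit]
    return [[coord * factor for coord in vertex] for vertex in vertices_m]

# A point 5 cm above the origin, exported in millimeters:
print(scale_vertices([[0.0, 0.05, 0.0]], "mm"))   # [[0.0, 50.0, 0.0]]
```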
You can use the iPhone X, XS, XS Max rotating 3D scanner by Andrew Riley.
You can mirror the scanning preview in the iOS Settings app: navigate to the iOS Settings app > Heges and enable Mirroring preview around vertical axis. Force quit the app if it's open, and the preview will now be mirrored! Take a look at this screenshot.
Since version 1.3, Heges can inform you when it is having difficulties during scanning by displaying an on-screen warning and, on iPhones, also through vibration feedback.
When you start to feel the vibrations (or see the on-screen warning), it means you should be more careful — e.g. slow down your movement, get closer to the scanned object (but no closer than 10 centimeters or 4 inches) or move back to a spot where you did not get the warning. Treat these warnings only as a suggestion to be more cautious — it can happen that you'll be able to continue scanning just fine even when you get the vibration feedback.
You may want to disable the vibration warning on older devices (e.g. iPhone X) as it can cause performance loss.
You can disable the vibration warning in the iOS Settings app -> Heges -> Scanning Warning Vibration Feedback.
I would recommend 2 programs for mesh post-processing and cleaning: MeshLab and Autodesk Meshmixer. Both are free and available on macOS and Windows.
These are the basic post-processing steps I recommend in MeshLab for STL and PLY files:
r == 0 && g == 0 && b == 0
and click on the Apply button. Then click on the Close button. This will select all points that are black. (A scripted alternative to this black-vertex clean-up is sketched below.)
After you perform all the edits, you can export the post-processed model into a new STL or PLY file by clicking "File > Export Mesh As".
You can of course select faces and vertices in MeshLab and remove them by hand too. If you want to close holes in the model and prepare it for 3D printing in general, you can use Autodesk Meshmixer.
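If you need to repeat the black-vertex clean-up often, that step can also be scripted outside MeshLab. Below is a minimal sketch, assuming the trimesh library and an exported PLY with per-vertex colors (file names are examples only):

```python
import numpy as np
import trimesh  # assumed third-party mesh library

mesh = trimesh.load("scan.ply", process=False)
colors = np.asarray(mesh.visual.vertex_colors)[:, :3]   # per-vertex RGB (alpha dropped)

keep = ~np.all(colors == 0, axis=1)          # True for every vertex that is not pure black
index_map = np.full(len(keep), -1, dtype=int)
index_map[keep] = np.arange(keep.sum())

# Keep only faces whose three corners all survive, then remap their vertex indices.
faces = np.asarray(mesh.faces)
face_ok = keep[faces].all(axis=1)
new_faces = index_map[faces[face_ok]]

cleaned = trimesh.Trimesh(vertices=np.asarray(mesh.vertices)[keep],
                          faces=new_faces,
                          vertex_colors=np.asarray(mesh.visual.vertex_colors)[keep],
                          process=False)
cleaned.export("cleaned.ply")
```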
The LiDAR sensor produces lower-quality depth data than the selfie FaceID camera (at least in iOS 14). Using the 0.5mm and 0.8mm Precisions yielded results indistinguishable from the 1mm Precision, hence the 0.5mm and 0.8mm Precisions are disabled when using LiDAR.
The current version of Heges saves color information into the OBJ and PLY files in the form of per-vertex colors. If you need to convert them into meshes with JPEG/PNG textures, follow this tutorial:
Scanned models can have a large number of triangles. The goal is to arrive at an optimized mesh with fewer triangles and better topology — this can be done with Instant Meshes. Once you have such an optimized mesh ready, you can bake / transfer the vertex colors from the raw mesh ("cleaned.ply") into the texture of the optimized mesh ("retopo.ply"). Here is how to do it:
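Side note: if per-vertex colors on the optimized mesh are enough for your use case and you don't strictly need a JPEG/PNG texture, a rough shortcut is a nearest-neighbour color transfer instead of a texture bake. This is not the workflow from the tutorial, just a sketch assuming the trimesh and SciPy libraries and the file names mentioned above:

```python
import numpy as np
import trimesh                      # assumed third-party mesh library
from scipy.spatial import cKDTree   # assumed SciPy is installed

raw = trimesh.load("cleaned.ply", process=False)    # dense scan with per-vertex colors
retopo = trimesh.load("retopo.ply", process=False)  # optimized mesh from Instant Meshes

# For every vertex of the optimized mesh, copy the color of the nearest raw-scan vertex.
tree = cKDTree(np.asarray(raw.vertices))
_, nearest = tree.query(np.asarray(retopo.vertices))
retopo.visual.vertex_colors = np.asarray(raw.visual.vertex_colors)[nearest]
retopo.export("retopo_colored.ply")
```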
TL;DR: Just enable vertex color rendering in Blender and you will see the per-vertex colors of your scan. See the video at the bottom of this text.
Full answer: the reason you don't see colors in Blender by default is that Heges stores color per-vertex in the PLY and OBJ files, and Blender does not display vertex colors unless you enable them. Follow this guide to enable vertex colors in Blender:
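The guide goes through the Blender UI. For anyone who prefers scripting, roughly the same setup can be done from Blender's Python console. This is only a sketch (assuming Blender 2.8x or newer, with the imported scan selected and a vertex-color layer present), not a substitute for the guide:

```python
import bpy

obj = bpy.context.active_object          # the imported scan, assumed to be selected

# Build a material whose Base Color comes from the mesh's vertex-color layer.
mat = bpy.data.materials.new("VertexColors")
mat.use_nodes = True
nodes = mat.node_tree.nodes
links = mat.node_tree.links

vcol = nodes.new("ShaderNodeVertexColor")    # shown as "Color Attribute" in newer Blender versions
principled = nodes["Principled BSDF"]        # created automatically when use_nodes is enabled
links.new(vcol.outputs["Color"], principled.inputs["Base Color"])

obj.data.materials.append(mat)
```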