Placing 3D Objects in Real-world and Enabling AR with SwiftUI
Kinjal P
Apple declared at a recent WWDC that the best way to build an app is with Swift and SwiftUI. While the Swift language has united the community, SwiftUI, the newer framework for building interfaces, is at the center of the discussion. Its benefits are many, although it is designed as much for the future as for present-day scenarios.
What is SwiftUI?
SwiftUI is a development framework for creating user interfaces. It helps you build good-looking applications across all Apple platforms using Swift and significantly less code. You can deliver great user experiences on any Apple device with a single set of tools and APIs.
How does SwiftUI Work?
SwiftUI uses a concise declarative syntax, so you simply state what your user interface should do. For instance, you can declare that you want a list of products built from text views, then specify the alignment, font, and color for each element.
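As a quick illustration of that declarative style, here is a minimal sketch of such a product list. The `Product` model and view names are illustrative, not taken from any shipping app:

```swift
import SwiftUI

// A hypothetical product model for illustration.
struct Product: Identifiable {
    let id = UUID()
    let name: String
    let price: Double
}

// We declare *what* to show (name and price, with fonts, colors, and
// alignment), not how to build and update table cells ourselves.
struct ProductListView: View {
    let products = [
        Product(name: "Keyboard", price: 49.99),
        Product(name: "Mouse", price: 29.99)
    ]

    var body: some View {
        List(products) { product in
            VStack(alignment: .leading) {
                Text(product.name)
                    .font(.headline)
                Text(String(format: "$%.2f", product.price))
                    .font(.subheadline)
                    .foregroundColor(.secondary)
            }
        }
    }
}
```

SwiftUI diffs this description against the previous one and updates only the views that changed.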
How Can You Use SwiftUI for Placing 3D Objects in the Real World?
SwiftUI is designed to work alongside UIKit and AppKit in real-world scenarios, so you can adopt it incrementally in your existing apps. When it is time to build a new part of your user interface (UI) or rebuild an existing one, you can use SwiftUI while keeping the rest of your code base.
When you recompile your SwiftUI apps, you can add 3D objects or display volumes. If an interface component you need is not available in SwiftUI, you can mix SwiftUI with UIKit and AppKit to take advantage of both.
With the third-party Model3DView package, you can place and display 3D models in your SwiftUI application much as you would images. You can manipulate the camera and transform the model, all while keeping things SwiftUI-friendly.
SwiftUI’s rotation3DEffect modifier lets you rotate views in 3D space to create striking effects with almost no code. It takes two main arguments: the angle to rotate by, and a tuple giving the X, Y, and Z components of the axis about which to perform the rotation.
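A minimal sketch of the modifier in action (the view and its styling are illustrative):

```swift
import SwiftUI

// Rotate a text card 45 degrees around the Y axis.
// rotation3DEffect takes an Angle plus an (x, y, z) axis tuple.
struct RotatedCardView: View {
    let angle = Angle(degrees: 45)

    var body: some View {
        Text("Hello, AR!")
            .font(.largeTitle)
            .padding()
            .background(Color.blue.opacity(0.2))
            .rotation3DEffect(angle, axis: (x: 0, y: 1, z: 0))
    }
}
```

Setting the axis to `(x: 1, y: 0, z: 0)` instead would tilt the card toward or away from the viewer.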
Key Benefits of Using SwiftUI
SwiftUI has many benefits, one of which is that it makes UI development faster and easier. Because SwiftUI uses a declarative syntax, programmers can declare how the user interface should look and behave without spelling out a sequence of imperative steps. As a result, interfaces are faster to build and easier to maintain.
- Simplicity and Proximity
- Greater Interoperability
- Engaging and Interactive Canvas
SwiftUI's closeness to frameworks like React, Flutter, and Jetpack Compose welcomes programmers from other platforms to try native development on Apple platforms.
Interoperability is one of SwiftUI's key strengths. Adoption can be gradual, as it interfaces seamlessly with UIKit and AppKit.
Its deep integration with Xcode is another asset. With Previews, you can effortlessly visualize how a given SwiftUI view renders. Any change in the code instantly reloads the view, which you can interact with directly. You can also render the same view concurrently on different screen configurations.
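A minimal sketch of how such a preview is declared (the view, its `name` property, and the device names are illustrative; available device strings vary by Xcode version):

```swift
import SwiftUI

struct GreetingView: View {
    let name: String

    var body: some View {
        Text("Hello, \(name)!")
            .font(.title)
    }
}

// Xcode renders this preview live next to the editor; edits to
// GreetingView reload instantly, and several devices can be shown at once.
struct GreetingView_Previews: PreviewProvider {
    static var previews: some View {
        Group {
            GreetingView(name: "SwiftUI")
                .previewDevice("iPhone 14")
            GreetingView(name: "SwiftUI")
                .previewDevice("iPad Pro (11-inch) (4th generation)")
        }
    }
}
```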
Technostacks Has Successfully Developed an iOS App with SwiftUI
We at Technostacks, a leading mobile app development company, have successfully created an application with SwiftUI in which you can add 3D objects.
When you open this demo app, it first starts detecting a horizontal plane using what ARKit calls an "ARCoachingOverlayView". This overlay is designed to help users position the device and move around to get a better AR experience.
The ARCoachingOverlayView provides visual instructions and tips that guide users toward finding and interacting with AR content more effectively. Once it detects the required plane, we can place objects in the real world.
- Pressing the rocket button adds a 3D model of a rocket to the screen. We created this model with the help of "Reality Composer."
- Reality Composer is a powerful tool developed by Apple that allows developers and designers to create augmented reality (AR) experiences without requiring extensive knowledge of 3D modeling or programming. It works alongside Apple's ARKit and RealityKit frameworks for building AR applications on iOS devices.
- You can change the size of the rocket, and when you tap the rocket, it starts flying. We added this animation and behavior in Reality Composer, where everything can be customized. We also added audio that plays when the rocket starts flying.
- There is another option, a blue box button. When you press it, a blue 3D box is added to the screen. You can rotate the box and change its position and size.
And lastly, when you press the "Delete" button, all the objects that were added are removed.
Implementation of the App with SwiftUI
SwiftUI has no built-in support for creating augmented reality (AR) views. However, we can build our own custom AR view with the help of UIViewRepresentable, which is available in SwiftUI.
Before we start coding, we first need to add a camera-usage permission entry to "Info.plist".
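The entry uses the `NSCameraUsageDescription` key; the description string below is just an example, so use wording appropriate to your app:

```xml
<key>NSCameraUsageDescription</key>
<string>This app uses the camera to place 3D objects in your surroundings.</string>
```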
Now, create a separate file, "CustomARViewRepresentable", like the one below, which returns our custom ARView.
```swift
import SwiftUI

// MARK: - Create custom ARView with UIViewRepresentable
struct CustomARViewRepresentable: UIViewRepresentable {
    func makeUIView(context: Context) -> some UIView {
        return CustomARView()
    }

    func updateUIView(_ uiView: UIViewType, context: Context) {}
}
```
For better understanding and easier maintenance, we created a separate file that holds everything related to ARView, like this.
```swift
import ARKit
import Combine
import RealityKit

// MARK: - Custom ARView
class CustomARView: ARView {
    let coachingOverlay = ARCoachingOverlayView()
    var rocket: Rocket.Scene?                   // Reality Composer scene, loaded later
    var cancellable = Set<AnyCancellable>()     // Holds the Combine subscription

    required init(frame frameRect: CGRect) {
        super.init(frame: frameRect)
    }

    dynamic required init?(coder decoder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    convenience init() {
        self.init(frame: UIScreen.main.bounds)

        // Add ARCoachingOverlayView
        self.addCoaching()

        let config = ARWorldTrackingConfiguration()
        config.planeDetection = .horizontal
        self.session.run(config, options: [])

        // Subscribe for user events
        subscribeActionStream()
    }
}
```
We have added an ARCoachingOverlay view for detecting a horizontal plane like below.
```swift
extension CustomARView: ARCoachingOverlayViewDelegate {
    func addCoaching() {
        coachingOverlay.delegate = self
        coachingOverlay.session = self.session
        coachingOverlay.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        coachingOverlay.center = self.center
        coachingOverlay.goal = .horizontalPlane
        self.addSubview(coachingOverlay)
    }

    public func coachingOverlayViewDidDeactivate(_ coachingOverlayView: ARCoachingOverlayView) {
        // Called once the coaching goal (a horizontal plane) has been detected
        print("plane detected")
        coachingOverlayView.removeFromSuperview()
    }
}
```
Now, add the "CustomARViewRepresentable" view in ContentView. We added three buttons at the bottom of the screen to perform the different actions.
```swift
struct ContentView: View {
    @StateObject private var viewModel = ARCoachingViewModel()

    var body: some View {
        CustomARViewRepresentable()
            .ignoresSafeArea()
            .overlay(alignment: .bottom) {
                ScrollView(.horizontal) {
                    HStack {
                        Button {...} label: {...}
                        Button {...} label: {...}
                        Button {...} label: {...}
                    }
                    .padding()
                }
            }
    }
}
```
Add an enum for the user actions and a singleton class that defines the publisher to which CustomARView will subscribe.
```swift
enum ARAction {
    case placeBox(color: Color)
    case removeAllAnchors
    case add3DModel
}
```
```swift
class ARManager {
    static let shared = ARManager()
    var actionStream = PassthroughSubject<ARAction, Never>()
}
```
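The buttons in ContentView can then publish actions through this singleton. Here is a minimal, self-contained sketch of that flow; it re-declares simplified versions of the article's types (dropping the `placeBox` case so the sketch does not depend on SwiftUI's `Color`), and `didTapRocketButton` is a hypothetical handler name:

```swift
import Combine

// Simplified stand-ins for the article's types, so the sketch runs on its own.
enum ARAction: Equatable {
    case removeAllAnchors
    case add3DModel
}

final class ARManager {
    static let shared = ARManager()
    private init() {}
    // Typed subject: emits ARAction values and never fails.
    let actionStream = PassthroughSubject<ARAction, Never>()
}

// A button's action closure just sends a value; the ARView's subscription
// decides what to do with it, keeping the UI layer free of AR code.
func didTapRocketButton() {
    ARManager.shared.actionStream.send(.add3DModel)
}
```

Because the UI only ever talks to `actionStream`, the AR logic can change without touching the buttons.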
Now, add the subscription method in the custom class, and call the appropriate function for each action received.
```swift
// MARK: - Perform action according to user click
func subscribeActionStream() {
    ARManager.shared.actionStream
        .sink { [weak self] action in
            switch action {
            // Add 3D box with a specific color
            case .placeBox(let color):
                self?.placeBlock(ofColor: color)

            // Remove all objects
            case .removeAllAnchors:
                self?.scene.anchors.removeAll()

            // Add 3D model
            case .add3DModel:
                do {
                    let object = try Rocket.loadScene()
                    self?.scene.addAnchor(object)
                    self?.rocket = object
                    self?.rocket?.rocket?.generateCollisionShapes(recursive: true)
                    let box = self?.rocket?.rocket as? Entity & HasCollision
                    if let box = box {
                        self?.installGestures([.rotation, .scale, .translation], for: box)
                    }
                } catch {
                    print(error)
                }
            }
        }
        .store(in: &cancellable)
}
```
There are three actions:
- First, for adding a 3D box,
- The second is for removing all 3D objects,
- And third for adding a rocket 3D model.
Rocket is a 3D model created with Reality Composer. We first load the Rocket scene and then attach gestures, so you can rotate, scale, and move the object, including changing its size.
Below is the function for placing a 3D box. After placing it, you can perform every action on it, like rotation, scaling, and translation.
```swift
func placeBlock(ofColor color: Color) {
    let box = MeshResource.generateBox(size: 0.1)
    let material = SimpleMaterial(color: UIColor(color), isMetallic: true)
    let entity = ModelEntity(mesh: box, materials: [material])

    let anchor = AnchorEntity(plane: .horizontal)
    anchor.addChild(entity)
    scene.addAnchor(anchor)

    entity.generateCollisionShapes(recursive: true)
    self.installGestures([.rotation, .scale, .translation], for: entity)
}
```
Conclusion
SwiftUI represents the future of development on all Apple platforms. Even today, it offers the following benefits:
- Its robust interoperability with UIKit and AppKit allows a progressive but simple adoption.
- Its declarative pattern of interface construction and native compatibility with reactive programming make it a modern and highly useful framework.
- Along with its future promise, it is already required today for building iOS widgets.
In this article, we discussed how our development teams at Technostacks have leveraged SwiftUI and successfully implemented it in real-world client projects. Contact us to create modern, real-world augmented reality applications of this kind, or if you have a related app idea.