While macOS Monterey (also known as macOS 12) brings several new features for users, the update also comes with significant improvements for developers, including brand-new APIs that enable new possibilities for third-party apps. One of the new APIs is "Object Capture," which lets users easily create 3D models of any object using the iPhone camera.

Apple has been pushing the adoption of AR technologies for a while now, but creating a 3D model may not be the easiest thing in the world for some people. Usually, you would need advanced cameras to take 3D captures and then render them all in dedicated software. But that changes with macOS Monterey and iOS 15. With the Object Capture API, Apple says this whole process of capturing and rendering 3D models will now take only a few minutes. While there is still no app available in the App Store with this new feature, Apple provides some examples of how to compile an app using the new API, and of course I had to test it myself.

Requirements

First, you need an iPhone or iPad with a dual-lens rear camera (and preferably a LiDAR scanner, although one is not required) to capture depth data. I used my iPhone 12 Pro Max running the iOS 15 beta for this demonstration. With the example app provided by Apple, I had to capture multiple images of the object at different angles so the API could then render the 3D object in 360 degrees. You need about 30 photos to create a 3D model, but Apple recommends using many more than that to get a high-quality result. In a real-life scenario, you should also have optimal lighting conditions, a tripod, and a mechanism to automatically rotate the object without changing its position.

I remember the first-generation Intel Core Macs not being long for this world. I also vaguely recalled the 15" to 16" MBP transition seeming sudden and looked that up: the 13" and 15" MBPs were updated in May, then the 16" replaced the 15" a few months later in November, while the 13" continued on unchanged.
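For developers, the capture-then-reconstruct flow the Object Capture article describes corresponds to RealityKit's PhotogrammetrySession API in macOS Monterey. The sketch below is illustrative rather than Apple's sample app: the input folder and output file paths are placeholders, and it assumes a macOS 12 command-line target with RealityKit available.

```swift
import Foundation
import RealityKit  // PhotogrammetrySession requires macOS 12 (Monterey) or later

// Folder containing the photos captured on the iPhone (placeholder path).
let imagesFolder = URL(fileURLWithPath: "/path/to/captured-images", isDirectory: true)

// Optional tuning: sequential ordering suits photos taken while circling the object.
var config = PhotogrammetrySession.Configuration()
config.sampleOrdering = .sequential
config.featureSensitivity = .normal

let session = try PhotogrammetrySession(input: imagesFolder, configuration: config)

// Request a USDZ model at medium detail (levels range from .preview up to .raw).
let request = PhotogrammetrySession.Request.modelFile(
    url: URL(fileURLWithPath: "object.usdz"),
    detail: .medium)

// Outputs arrive as an async sequence: progress updates, errors, finished models.
Task {
    for try await output in session.outputs {
        switch output {
        case .requestProgress(_, let fraction):
            print("Progress: \(Int(fraction * 100))%")
        case .requestComplete(_, let result):
            if case .modelFile(let url) = result {
                print("Model written to \(url.path)")
            }
        case .requestError(_, let error):
            print("Reconstruction failed: \(error)")
        case .processingComplete:
            print("All requests finished.")
        default:
            break
        }
    }
}

try session.process(requests: [request])
```

In a real command-line tool you would also keep the process alive (for example with `RunLoop.main.run()`) until `.processingComplete` arrives, since reconstruction runs asynchronously.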
As for the others, I was curious about the 14" iBook and checked EveryMac. Looks like the iBooks (only 12" at the time) were updated in May, then October, added the 14" the following January, then bumped both the 12" and 14" in May.

If someone just wants a bigger screen they don't want the 13/14" MacBooks, and unless they're willing to spend $2500+ on the 16" MBP, they're boned in the Mac ecosystem. Nice.

There's been a big gap for a while now. Lower-end big Apple laptops haven't existed for years, and that segment seems like a significant chunk of the PC side of things. The screen size matters way more than the SoC there, so they could get a decent chunk of sales and benefit from a lower SoC cost before updating it to the M3 later on.

Zuck posted a preview of the MR functionality of Quest 3. Two things strike me: the rendering quality and the framerate – both are garbage. Could this be a function of the capture method? Sure. But as a video generated by Meta to promote its own device, we need to take it at face value, IMO.

Onto the visuals: what… the… frack? Cartoonish in the worst way, flat and dimensionless. The virtual objects pick up nothing from their environment, leaving the impression of an overlay in ink and paint… like a cartoon. The only model showing a modicum of detail is the static dungeon pasted to a tabletop, natch. (Color me disappointed.) This is what Reality will be compared to, and we're less than 20 hours from that comparison. I kinda love the fact that Apple put more attention into its hologram AR invite than Meta did its own reveal video. The hologram's dynamic shadow map even shows hints of color, giving the effect of indirect illumination.