I tried the latest Apple Intelligence features on the iOS 18.2 developer beta, and the first thing I felt was that I'm being scammed, so hear me out. Apple calls it a suite of AI features, Apple Intelligence, but when I press and hold the Camera Control to do a visual search, then tap on the magnifying glass, it says "searching with Google" and slaps the Google Lens search results on the screen in a small overlay window. Similarly, if I want to ask a question about the image by tapping the Ask button, it invokes ChatGPT. So can anyone tell me where the Apple intelligence is here? Apple is literally using Google's and OpenAI's smart features and calling it Apple Intelligence.

But this is just the beginning. The second problem is that you are getting the most basic version of both. Let's start with Google Lens. On the left I'm using Apple Intelligence, and on the right I'm using the Google app to do the exact same visual search on the same iPhone 16 Pro Max. With Apple Intelligence, all I get is the search results in a web container, and that's pretty much it. In the Google app I can do the same, plus I get the ability to add extra words to the query to make it more specific, like checking the price, looking for a different color, or finding nearby stores that sell the same product, etc. In this example you can clearly see that Apple Intelligence gives you less, while you can get far more by using the Google app, completely free of charge, on your iPhone.

And the same story repeats itself with ChatGPT. When I use it through Apple Intelligence, all I get is some text on the screen, and it doesn't read it back, which makes the ChatGPT app a much better option, as I can listen to the answer while doing something else. So why is Apple Intelligence not available on iPhones prior to the 15 Pro? It's not even Apple's technology. Plus, Google and ChatGPT offer more features in their own apps, which you can install on any iPhone running iOS 15 or later for the Google app and iOS 16.4 or later for ChatGPT.

Another thing that made the whole experience feel more like a scam is that only the iPhone 16 models support visual search, and guess what, that's because only the iPhone 16 models have the Camera Control. As if Apple cannot put a button on the screen to trigger the same feature on older models. They want to give the useless Camera Control a fake value by intentionally making it a mandatory requirement when it's not. Again, I can download these apps and get an even better experience without needing to purchase an iPhone 16.

Before moving to the next chapter, let me remind you about the Wallpapers by In-Depth Tech Reviews app. If you like any of the wallpapers I use in my videos, that's where you can find them. I release 12 new wallpapers every week. The app gives you multiple styling options like blur, brightness, and hue to make your wallpaper stand out, with the ability to edit your home and lock screen wallpapers separately, sync your favorites across all your devices, and more. The Google Play Store download link is in the description. And now let's move on to the next feature.

So that's it when it comes to visual search. Now let's talk about Siri. Once more, Apple used ChatGPT to make Siri look smarter, but the overall experience is not great, so let me show you a quick demo. "Ask ChatGPT how to make lasagna." As you see, all it gives me is text on the screen, and I have to read it myself. My only option is to copy the text, but I cannot listen to it; I have my volume set to the max and it doesn't read anything back. So let's try the ChatGPT app: "How to make a lasagna?" "Making a lasagna is a bit of an art, but here's a simple way to do it. First, preheat your oven to 375°F. Cook your lasagna noodles according to the package, and while they're boiling, prepare your meat sauce. Brown some ground beef or Italian sausage..."

As you saw, Siri acts as an operator that takes my own words and passes them over to ChatGPT, and it doesn't read back the responses, which is a huge bummer, while the ChatGPT app gives a more intuitive experience, as if I'm talking to someone, so I can use it while driving, cooking, or walking.

So overall, when it comes to Siri and Visual Intelligence, Apple didn't offer anything of their own. Not only did they rely on other companies to do their part, but they also gave you the most basic experience you can get from these services. Plus, they made them exclusive to the latest and greatest iPhone, while any iPhone user can download the third-party apps and enjoy the same experience, if not a better one. Other than this, I think they did well in other areas like the writing tools, the Photos app AI features, and so on, which is something I'm going to talk about in future videos and compare with the same features offered by Samsung and Google. But for now, let me know what you think in the comments: do you agree that Apple messed up with Siri and Visual Intelligence, or do you think otherwise? Thank you so much for watching, and see you in the next video.