Snapchat is testing an Amazon-powered 'visual search' feature
Snapchat is working on a new, unreleased feature, codenamed Eagle, that lets users press and hold on the screen to scan objects.
When it comes to Stories, it is Instagram that leads the industry, leaving Snapchat and others behind. Now, to make its Stories more appealing to users, Snapchat is looking to expand beyond its famous AR camera effects and turn a user's camera into a visual search tool.
According to a report from Beebom, the feature was first spotted by app researcher Ishan Agarwal, who found code hinting at a 'visual search' feature buried in Snapchat's Android app.
The feature, codenamed Eagle, allows a user to press and hold on the screen to scan an object and be pointed to a list of related Amazon results. It is one of several features expected to be baked into the app.
The feature is described within the app's code, which reads: "Press and hold to identify an object, song, barcode, and more! This works by sending data to Amazon, Shazam, and other partners." Snapchat already lets users scan Snapcodes and recognize songs via Shazam, so the new feature is expected to build on top of that existing functionality.
At present, there is no information on exactly how Snapchat's 'visual search' feature will function. However, the hidden code suggests that a user will first capture a photo, after which the app will pull up Amazon listings related to everything it sees in the frame. These are expected to appear in 'context cards' at the bottom of the screen, from which users can tap a link to purchase a product or share it with friends.
If Snapchat is unable to recognize any products in the image, users will see an error message saying 'Bummer, we didn't catch that!' and will need to scan again. In overall functionality, the feature sounds similar to Google Lens and Pinterest Lens, which use machine learning-powered algorithms to recognize objects.