Everyone is talking about FaceApp - the app that can edit photos of people's faces to show younger or older versions of themselves.
Thousands of people are sharing the results of their own experiments with the app on social media.
But since the face-editing tool went viral in the last few days, some have raised concerns over its terms and conditions.
They argue that the company takes a cavalier approach to users' data - but FaceApp said in a statement most images were deleted from its servers within 48 hours of being uploaded.
The company also said it only ever uploaded photos that users selected for editing and not additional images.
FaceApp is not new. It first hit the headlines two years ago with its "ethnicity filters".
These purported to transform faces of one ethnicity into another - a feature that sparked a backlash and was quickly dropped.
The app can, however, turn blank or grumpy expressions into smiling ones. And it can tweak make-up styles.
This is done with the help of artificial intelligence (AI). An algorithm takes the input picture of your face and adjusts it, drawing on patterns learned from many other images of faces.
This makes it possible to insert a toothy smile, for instance, while adjusting lines around the mouth, chin and cheeks for a natural look.
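To give a flavour of the idea - as a toy illustration only, not FaceApp's actual technique, which relies on trained neural networks - the Python sketch below simply blends a user's photo with a reference image. The filenames are placeholders.

    # Toy illustration only: real apps use trained neural networks,
    # not simple pixel blending. Filenames are placeholders.
    from PIL import Image

    user_photo = Image.open("your_face.jpg").convert("RGB")
    reference = Image.open("smile_reference.jpg").convert("RGB").resize(user_photo.size)

    # Mix 70% of the original face with 30% of the reference imagery -
    # a crude stand-in for the learned adjustments a neural network makes.
    edited = Image.blend(user_photo, reference, alpha=0.3)
    edited.save("edited_face.jpg")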
So what's the problem?
Eyebrows were raised recently when app developer Joshua Nozzi tweeted that FaceApp was uploading troves of photos from people's smartphones without asking permission.
However, a French cyber-security researcher who uses the pseudonym Elliot Alderson investigated Mr Nozzi's claims.
He found that no such bulk uploading was going on - FaceApp was only taking the specific photos users decided to submit.
FaceApp also confirmed to the BBC that only the user-submitted photo is uploaded.
What about facial recognition?
Others have speculated that FaceApp may use data gathered from user photos to train facial recognition algorithms.
This can be done even after the photos themselves are deleted because measurements of features on a person's face can be extracted and used for such purposes.
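In principle, such a system keeps only a short list of numbers - an "embedding" - for each face, and compares those numbers rather than the photos themselves. The Python sketch below shows the comparison step with made-up values; FaceApp says it does not do this.

    import numpy as np

    # Hypothetical face "embeddings": fixed-length vectors a recognition
    # model might extract from two photos. The values here are made up.
    face_a = np.array([0.12, -0.48, 0.33, 0.91, -0.07])
    face_b = np.array([0.10, -0.45, 0.35, 0.88, -0.05])

    # Cosine similarity: a value close to 1.0 suggests the same person.
    similarity = face_a @ face_b / (np.linalg.norm(face_a) * np.linalg.norm(face_b))
    print(f"similarity: {similarity:.3f}")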
"No, we don't use photos for facial recognition training," the firm's chief executive, Yaroslav Goncharov told BBC News. "Only for editing pictures."
Is that it?
Not quite. Some question why FaceApp needs to upload photos at all when the app could in theory just process images locally on smartphones rather than send them to the cloud.
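The alternative is cloud processing, in which the selected photo leaves the phone. As a rough sketch of what that involves - using a made-up endpoint, not FaceApp's real API - a client might do something like this:

    import requests

    # Hypothetical endpoint for illustration; not FaceApp's real API.
    UPLOAD_URL = "https://example.com/api/edit"

    # Only the one photo the user selected is read and sent.
    with open("your_face.jpg", "rb") as photo:
        response = requests.post(UPLOAD_URL, files={"photo": photo},
                                 data={"filter": "smile"}, timeout=30)

    response.raise_for_status()
    with open("edited_face.jpg", "wb") as out:
        out.write(response.content)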
In FaceApp's case, the server that stores user photos is located in the US. FaceApp itself is a Russian company with offices in St Petersburg.
Cyber-security researcher Jane Manchun Wong tweeted that this may simply give FaceApp a competitive advantage - keeping the algorithms on its own servers makes it harder for rivals building similar apps to see how they work.
What else does FaceApp have to say?
Mr Goncharov shared a company statement that said FaceApp only uploads photos selected by users for editing. "We never transfer any other images," the statement added.
"We might store an uploaded photo in the cloud.
"The main reason for that is performance and traffic: we want to make sure that the user doesn't upload the photo repeatedly for every edit operation.
"Most images are deleted from our servers within 48 hours from the upload date."
The statement said that while FaceApp accepted requests from users to have their data deleted, its support team was currently "overloaded".
FaceApp advises users to submit such requests by going to settings, then support, then "report a bug", and adding the word "privacy" in the subject line.
User data was not transferred to Russia, the statement added.
The UK's Information Commissioner's Office (ICO) told BBC News it was aware of stories raising concerns about FaceApp and that it would be considering them.
"We would advise people signing up to any app to check what will happen to their personal information and not to provide any personal details until they are clear about how they will be used," a spokeswoman for the ICO said.
Source: BBC