The ads are enticing and the websites look legit. Where to look for the red flags.
GREENSBORO, N.C. — Are you looking for a last-minute getaway? One of the most common things to do is to Google “last-minute deals.”
“Google doesn’t always take them to the right place. And there are also misleading ads that are on social media,” said Yoav Keren, Brandshield co-founder and CEO.
Maybe you’ve seen it on Instagram or Facebook: an ad claiming rooms for half off or flights for cheap.
“You might be directed to a bogus site where scammers are trying to phish you and take your credentials and they’re not selling you a real vacation,” said Keren.
When you see those ads, Keren says you need to take another step.
“If the deal is real, it will be on the main site for the company. So don’t click the link in the ad, instead go to the company that is providing the trip, house, etc., and make sure the deal is real,” said Keren.
These scam websites are getting better and better, for the scammers that is! Take a look at the three sites in this story. They all look legit, but they’re all fake. They appear to be Delta, JetBlue, and Airbnb.
“When you’re going to a website, look at the domain name, the URL, and make sure it doesn’t have typos,” said Keren.
The real Delta site for example is delta.com
But the fake website’s address is similar enough that you might not catch it: delta-aairlines.org
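A lookalike address like that can even be caught programmatically. As an illustration only (the fake domain comes from this story, the typo’d example and the 0.75 similarity threshold are assumptions for the sketch), here is a short Python check using the standard library’s difflib:

```python
from difflib import SequenceMatcher
from typing import List, Optional

def looks_like(hostname: str, known_domains: List[str],
               threshold: float = 0.75) -> Optional[str]:
    """Return the known brand domain a hostname appears to imitate, if any."""
    hostname = hostname.lower()
    for real in known_domains:
        if hostname == real:
            return None  # exact match: this is the genuine site
        if SequenceMatcher(None, hostname, real).ratio() >= threshold:
            return real  # close but not equal: classic typosquat pattern
    return None

# "detla.com" is a made-up typo of delta.com, used here for illustration.
print(looks_like("detla.com", ["delta.com"]))  # flagged as imitating delta.com
print(looks_like("delta.com", ["delta.com"]))  # genuine site, not flagged
```

The same idea is what commercial brand-protection tools apply at scale: a domain that is very similar to, but not exactly, a known brand name is treated as a red flag.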
Keren warns that in many of these cases, you can’t get your money back, especially since some of these sites ask you to wire money or use Venmo, Cash App, or Zelle. Those P2P cash apps don’t work like a credit card or bank. Once the money is gone, it’s gone.
A good rule of thumb is to always use a credit card. This way, you can call the credit card company and get the charges reversed. When you use a debit card, the money comes out of your bank account and may be harder to get back.
At music festivals across the U.S. this summer, there’s a booth drawing a lot of attention. Away from the stage of stars, it’s not food or drinks, but it is free. Between sets of their favourite bands, fans can get the tools and knowledge to reverse an opioid overdose.
“We have to target people who might not think that something like this will happen to them, and we found the best place to reach those people in an open-minded state is through music and arts events,” said William Perry, co-founder of This Must Be The Place. The non-profit, based in Columbus, Ohio, is nearly two years old and making waves across the U.S. in hopes of helping and healing people impacted by the unprecedented opioid epidemic.
“People become a little bit more open minded to the fact that their friends or just strangers around them might be in danger and they do have this ability to help them,” said Perry.
This Must Be The Place was a name that could have been temporary, but it stuck. “Wherever we go, whatever festival we’re at, this must be the place where you can get free Naloxone,” said Ingela Travers-Hayward, co-founder of the organization and Perry’s wife, originally from Toronto.
The couple say they set out to do whatever the health units were not. They wanted to meet people in a space where they felt comfortable and could talk in an open setting.
With no pharmacists behind the booth, they have a team with varying lifestyles and backgrounds, ready to connect with concert goers. With their pet Corgi as a mascot, casual clothing and a unique sense of style, they easily attract a crowd to their booth.
“They’re going to a festival, they’re not expecting to see free Naloxone and we want to make it so when they see that word, they feel really comfortable to come up to the booth. They feel like, ‘OK, let me go investigate,’” said Travers-Hayward.
The non-profit works to normalize the conversation about drugs, while teaching people how to use Naloxone, a medication used to reverse the effects of opioids.
This Must Be The Place co-founders Ingela Travers-Hayward and William Perry never imagined they’d be running a non-profit and handing out Naloxone, but they’re inspired by what they’ve accomplished ― providing people with the critical tools they need to help save a life. (Submitted: This Must Be The Place)
In two years, the organization has handed out more than 30,000 kits, donated by a pharmaceutical company. They’re now reaching a critical milestone, giving Naloxone to roughly one-in-20 people at a festival. “When you’re down in a crowd or down in the pit watching your favourite band, your 360 line of sight, and the people you can get to is roughly 20 to 25 people,” said Perry.
Street drugs have become increasingly tainted with powerful opioids, such as fentanyl, a deadly reality hundreds of thousands have faced over the years. Perry has seen the effects of the opioid epidemic firsthand. He spent a large part of his life struggling with addiction.
“People, much older than me gave me … some drugs when I was really young and they forgot to tell me, ‘Oh yeah, by the way, you’re going to be addicted to this and you’re gonna need this every single day,’” Perry said, opening up about his past.
Pausing to collect his thoughts, Perry takes a deep breath before saying “and so I didn’t know any better until I was caught up in it and so were all of my friends. Unfortunately, the majority of them aren’t with us anymore and that’s because of how long this has lasted and how dangerous it’s become.”
Perry now has five and a half years of sobriety under his belt and acts as a sober recovery counsellor, determined to help others. “There’s absolutely nothing I can do to bring back the friends that I’ve lost, but there is something that I can do to hopefully prevent someone else from going through that pain.”
The people who come by their booth range in age ― anyone from teenagers to people in their sixties. Some admit they themselves have never had experience with drugs, but everyone is happy to take a kit.
“So many people who take this have no experience with an overdose before, so it’s just like having to tell them, ‘It’s going to be scary, it’s going to be intense but you’re going to get through it ― just one little thing and you’re going to be able to save a life.’ That’s a huge comfort to people,” said Travers-Hayward.
A naloxone kit is shown in Vancouver, B.C., on Monday November 13, 2017. (Darryl Dyck/THE CANADIAN PRESS)
While festivals are often known as a place for drugs, the organization is targeting not only that market but also the surrounding communities. The group has heard stories of Naloxone it handed out being used not long after, and the kits that don’t get used go home in the hands of people willing to use them, with the skills to do so.
The toxic supply of street drugs is affecting more than those who are “typical” users.
“Even though maybe I am not a user, (it’s possible) somebody at my work is doing a mid-afternoon pick me up to get me through the day. You know, that’s possible,” said Perry. “And what we know about this opioid supply, that means that person at your job is highly at risk, which means you should probably have one of these at least somewhere in the break room, at your desk or whatever.”
This Must Be the Place is that opportunity for someone to get equipped.
“It’s everywhere now, and you can’t really live in this way of saying it’s not going to impact me. Ideally, it doesn’t … ideally, you never see someone overdosing or you don’t know someone who has overdosed, but you’re not as immune from it as you once were,” said Travers-Hayward.
The two come from vastly different backgrounds and met while Travers-Hayward was researching for a documentary in Columbus, Ohio. The documentary never came to fruition and neither imagined they’d be running a non-profit and handing out Naloxone, but they’re inspired by what they’ve accomplished ― providing people with the critical tools they need to help save a life.
Love on the Lawn is being presented for free this year in downtown Elgin, and music producer Mike Page hopes his hometown comes out to support the hip-hop and R&B music festival.
“I want everyone to have a good time,” said Page, owner of Impossible Dreamz Music Group in Atlanta. “I’m looking forward to seeing people dance and getting a lot of hugs and a lot of love.”
By doing away with the admission cost, he said he thinks they’ll be able to increase attendance from 700 in 2022 to 1,000 this year as the event returns for a second year.
“It’s going to be way more lively,” Page said of the event, which he’s putting together in conjunction with the city of Elgin. It’s set for 2 to 10 p.m. at Festival Park.
New this year will be a series of afterparties being held at area bars and restaurants to keep the energy going, Page said.
Jennifer Fukala, executive director of the Downtown Neighborhood Association of Elgin, worked with Page on promoting the event. Five restaurants have agreed to offer drink specials and host events following the festival, she said.
Banners at the festival will list the places people can go when the park entertainment ends.
“There’s a lot of interest and appetite for more nightlife in downtown Elgin,” Fukala said.
The Downtown Elgin Farmers Market has had a lot of success in drawing crowds to surrounding bars and restaurants and she thinks this could as well, she said.
“This being a music festival, it’s another opportunity to promote nightlife,” she said.
Love on the Lawn will feature local favorites like DJ Charles Wells, Rodney Blalark and up-and-coming DJ Weedy Roszay opening for headliner Bobby V, an R&B singer formerly known as Bobby Valentino. He’s worked with such artists as Nicki Minaj and Lil Wayne.
Sundance, a V103 FM host, will also be participating. “Her energy is great. People loved her last year,” Page said.
There will be more vendors this year, mostly Black- and Hispanic-owned businesses, he said.
Page’s dream, he said, is to have the festival be as big as Chicago’s Lollapalooza and to put Elgin on the map as a destination over Labor Day weekend.
“I want people to know Elgin is special,” he said. “We can make Elgin the hub of everything. Elgin is going to be the place.”
Fukala sees a lot of potential in Love on the Lawn too.
“Chicago being the birthplace of house music, there are a lot of people in this area who have a soft spot for that genre of music. It could be a great opportunity for Elgin to become a destination for Labor Day weekend,” she said.
Impossible Dreamz Music Group is donating 10% of its profits from the kids’ zone part of the event to the Elgin Northwest Tide Youth Football team.
Page said that while he may be living in Atlanta, he comes to Elgin to visit family and friends often and he wants to do right by the city.
“Elgin is everything to me. I love my town,” he said.
Gloria Casas is a freelance reporter for The Courier-News.
We begin Arizona Encore’s 2023-2024 season with music from the Grand Canyon Music Festival. On this episode of Arizona Encore, we’ll celebrate the festival’s 40th anniversary. What started almost as a fluke flute performance during a canyon hike in 1982 has now become a forward-thinking institution with equal commitments to music education and to outstanding performances by professional musicians.
String quintets, quartets and trios recorded at the 2022 festival will show off the diversity and breadth of classical music composed around the turn of the 20th century, but from different positions around the world. The Manhattan Chamber Players and the Catalyst Quartet recorded these performances last year and both ensembles will return to the Shrine of the Ages this weekend for special performances in honor of the festival’s 40th anniversary season.
Enjoy a selection from the Manhattan Chamber Players’ performance at the Grand Canyon Music Festival last year on September 3rd as we count down to the first broadcast of Arizona Encore’s 2023-2024 season.
This August, the University of Oregon community experienced a perilous seasonal phenomenon almost as familiar as sunshine and high temperatures: a flood of phishing emails.
Phishing scams attempt to trick the recipient into sharing sensitive information or establishing a relationship with a cybercriminal, who then proceeds to steal money, identities or intellectual property or gain unauthorized access to UO systems and data.
The phishing lures that proliferated most at the UO this August shared a similar approach. Both used tantalizing offers to entice UO community members to communicate via non-UO systems.
“If something seems too good to be true, it probably is,” said José Domínguez, interim chief information security officer. “Don’t take their bait.”
Cybercriminals target universities at the start of terms and over breaks, when people’s routines are disrupted. The fall surge of phishing emails typically starts in August, when semester schools start.
The latest phishing campaigns
In August, the Information Security Office saw two phishing scams proliferate.
One purported to offer several valuable musical instruments for free except for shipping costs. Another claimed to offer a personal assistant job paying more than $100 per hour.
Both campaigns were designed to move conversations off UO systems, beyond the scope of UO security measures. The first campaign asked people to send a text message to a phone number. The second asked the prospective job applicant to provide personal information through a form in Google Forms.
That exemplifies a trend the Information Security Office has observed in recent years and flagged in 2021 and 2022: as the UO has increased the security of its own systems, cybercriminals seek to lure UO students, staff and faculty members into personal email, text and chat conversations, where UO systems can’t track malicious activity.
However, the overall framework of the recent scams follows much older patterns of offering outsized rewards for little effort. Such scams can result in actual financial losses for individuals at the UO and elsewhere.
“We understand the temptation. Who doesn’t want to get something for almost nothing?” Domínguez said. “However, that temptation should raise a red flag for you. The only person who will come out ahead is the scammer.”
To distribute such malicious emails, the attackers often use a small handful of UO accounts that have already been compromised. The Information Security Office strives to identify such compromises quickly to deter further attacks and abuse.
To that end, Domínguez strongly encouraged all members of the UO community to report suspicious emails through the Report Phish button in Outlook or by forwarding them to firstname.lastname@example.org.
“Your reports help our team move faster to protect you and the rest of the university,” he said.
Domínguez also encouraged people to learn more about how to protect themselves from phishing attacks.
“We can stop or subvert about 99.6 percent of malicious email messages,” he said. “We are asking for your help with the 0.4 percent of messages that are not immediately detected.”
How to protect yourself
The Information Security Office offers the following tips for staying safe from phishing messages:
Beware of tantalizing offers. If it seems too good to be true, it probably is.
Don’t click links in suspicious messages.
Don’t share confidential information, yours or the university’s.
Beware of attachments. To avoid malicious software, or malware, delete any message with an attachment unless you’re expecting it and are absolutely certain it’s legitimate.
Be wary of suspicious emails from UO accounts. Cybercriminals often distribute phishing messages from accounts they’ve compromised.
Confirm identities. Cybercriminals often impersonate schools, financial institutions, health authorities, retailers and a range of other service providers by using official-looking logos and similar email addresses and URLs.
Deny unexpected Duo requests. If you receive a Duo verification request when you’re not logging into a Duo-protected UO service, tap “Deny” in the Duo Mobile app or 9 on a Duo phone call. Then confirm the login was suspicious to alert UO staff.
Keep your computer and other devices up to date. Those software and system updates often fix security gaps.
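The “confirm identities” tip above can be made concrete: the visible text of a link can say anything, so what matters is the hostname the URL actually points to. A minimal sketch using Python’s standard library (the lure URL below is hypothetical, not from a real campaign):

```python
from urllib.parse import urlparse

def hostname_of(url: str) -> str:
    """Return the hostname a link actually points to."""
    return urlparse(url).hostname or ""

# A familiar name at the START of a hostname proves nothing; only the
# registered domain at the END says whose server the link reaches.
lure = "https://uoregon.edu.example-login.com/reset"  # hypothetical phishing URL
print(hostname_of(lure))                      # ends in example-login.com
print(hostname_of("https://uoregon.edu/it"))  # genuinely uoregon.edu
```

This is the same check the “similar email addresses and URLs” warning describes: scammers often prepend a trusted name to an unrelated domain.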
Information Services offers more tips to help determine if a suspicious email is malicious, as does the Federal Trade Commission.
All UO employees, including graduate employees and student employees, also can take the UO Cybersecurity Basics training to learn more about protecting accounts and devices.
If you’ve responded to phishing
Anyone who has responded to a suspicious email should immediately contact email@example.com and then consider the following next steps, depending on the situation:
Entered Duck ID and password on a fake website? Go to Duck ID Account Management, change your password and revise security questions and answers.
Entered UO ID number, also known as a 95 number, and corresponding password, or PAC, on a fake DuckWeb site? Go to DuckWeb, change the PAC and verify that no important information has been changed.
Believe you’re the victim of an online crime, such as identity theft? Report it to UOPD at 541-346-2919 or online, no matter how minor it may seem. Identity theft happens when someone steals your personal information, such as your Social Security number, and uses it to obtain credit cards or loans or commit another form of fraud in your name.
To protect phishing victims, the Information Security Office will temporarily disable the account of anyone who has clicked a malicious link and potentially entered their credentials. To restore account access, users should contact the Technology Service Desk by phone at 541-346-4357 or by live chat.
New York, NY (Top40 Charts) Gearing up for her biggest and most personal season yet, multi-platinum recording artist Coi Leray presents a dynamic new EP entitled Blue Moon out now via Uptown Records/Republic Records.
Once again, the project finds Coi firing on all cylinders, using her songwriting as a diary to reveal her unique perspective on life. She laces highly quotable bars with irresistible hooks, showcasing her lyrical acumen and impressive vocal delivery at the same time. She opens up the world of Blue Moon with the single and music video “Isabel Marant.”
Ethereal production underscores her breathy verses as she reminds, “They can’t f with Coi Leray.” It culminates on a hypnotic hook that’s impossible to shake. The accompanying visual intercuts vignettes of Coi underwater, shimmering in blue glitter, and ruling this fever dream like a bossed-up high fashion mermaid.
Connecting deeper with her fans, she passionately shared on Instagram, “This EP, I get vulnerable. I got tired of ‘trying’ new things and wanted to just do what I do BEST. I have a hard time with telling my story because I get wrapped up in the media narratives, however music is the best way for me to tell it. Sorry I can’t argue with yall on the internet all day. It’s my life, my story, so why not let me tell it? s.. at least give me a chance to.
The amount of pressure I have on me in the music industry is wild. I mean, who knew I would be walking red carpets and stages with the same icons I listened to growing up. Being nominated or sitting next to ICONS on the Billboard charts in my lil 5-year run. I’m just grateful to have come this far. Just like @bustarhymes said, “who am I to judge someone who’s journey isn’t finished?” Enjoy – Coi”
It just continues a prolific summer for the superstar. She recently dropped her self-titled album, COI. It included the multiplatinum mega-smash “Players” and earned unanimous critical acclaim. HYPEBEAST proclaimed, “Leray is ready to prove herself once again on the new COI,” and HipHopDX raved, “Leray returns triumphant on sophomore effort COI – a follow-up that strips out the excessive features and autotune, and returns the focus to Leray’s eclectic set of influences, talent for pop hooks, and sex-positive flow.” HotNewHipHop also attested, “This new project features 16 tracks that are both catchy and engaging.”
Simultaneously, she has continued to make waves in the fashion world as one of the culture’s hottest crossover stars. Her presence has already proven inescapable, appearing on one cover after another for the likes of TOP40-CHARTS, NUMERO, PAPER, UPROXX, and EUPHORIA. Not to mention, she has shined in features by W, V Magazine, Interview, Elle, i-D, and more with high-fashion spreads. All of this just sets the stage for more moves in fashion as she cements herself as a style icon.
It’s Blue Moon season now!
Coi Leray consistently asserts herself as an elite musician, dynamite vocalist, massive personality, and boundary-breaking superstar without comparison or rival. Born in Boston, raised in New Jersey and based in Los Angeles, the multiplatinum force of nature has captivated without compromise.
Coi has served up a string of anthems from her debut album TRENDSETTER, including the double-platinum “No More Parties,” gold-certified “Big Purr (Prrrd)” [feat. Pooh Shiesty], the Billboard Hot 100 Top 40 hit “Blick Blick” [with Nicki Minaj], and the viral “TWINNEM.”
Coi has also earned her first Billboard Hot 100 Top 10 with her highest charting single to date “Players” landing at #9. It’s taking over social media with popular remixes such as “Players (David Guetta Remix),” “Players (Tokischa Remix),” “Players (DJ Smallz 732 Jersey Club Remix),” “Players (DJ Saige Remix),” and “Players (DJ Saige Remix)” feat. Busta Rhymes.
Together they set TikTok on fire with a staggering 10 billion views and nearly 3 million video creates as the success simultaneously translated to streaming platforms. It’s also dominating the radio with three consecutive weeks at #1 on Rhythmic Radio, #1 on Urban Radio, #1 on the Billboard Hot Rap Songs Chart, and #6 on Billboard’s Radio Chart.
This set the stage for her critically acclaimed full-length album, COI, which arrived in June 2023. Coi Leray continues to cross over into other genres with the hit song “Baby Don’t Hurt Me” with David Guetta and Anne-Marie.
Then there’s her blockbuster R&B anthem “Self Love” with Metro Boomin which was a highlight on the Spider-Man: Across the Spider-Verse soundtrack. Between generating billions of streams and views, she has infiltrated every corner of culture, appearing on XXL’s coveted Freshman Class cover, performing on NBC’s The Tonight Show Starring Jimmy Fallon, as well as ABC’s Jimmy Kimmel Live!, starring as the face of the SKIMS “Cozy Collection” by Kim Kardashian West, collaborations with major fashion brands such as Kenzo, Moschino, Fendi By Marc Jacobs, YSL Beauty, as well as guesting on Slime Language 2, which bowed at #1 on the Billboard 200.
Coi Leray’s reach continues to expand thanks to collaborations with everyone from Calvin Harris, RAYE, K Pop’s TOMORROW X TOGETHER and more. Beyond a slew of magazine covers and nominations at the American Music Awards, BET Awards, and iHeart Radio Music Awards, to name a few, she has incited critical acclaim from New York Times and Billboard to E! Online and more.
As a powerhouse performer in her own right, she has shined at Rolling Loud, Lollapalooza Festival, Governors Ball and more. Still, she keeps pushing forward. The crown belongs to Coi as she releases her anxiously awaited EP, Blue Moon, and much more to come.
The day before the most electrifying weekend of the Irish calendar kicks off, Hot Press are announcing the line-up for the iconic Hot Press Chat Room 2023!
The team are prepped, wellies in tow, to chat to some of the creme of the crop of Irish and international musicians, and we’re so excited to tell you all about it.
Lucy O’Toole will kick off the weekend at the Mindfield, with King Kong Company the first act up to the plate! King Kong Company are an Irish dance band hailing from Waterford. The group met and started while attending Waterford Institute of Technology; however, a majority of the group aren’t actually from the county at all! Currently, their line-up consists of Mark White (bass guitar), Colin Hoye (trumpet), Tom Stapleton (keyboard), Trish Murphy (dancer and choreography), Aaron Mulhall (drums), and Owen Corrigan (guitar). The group have been together, in some formation, since 1996, taking an 11-year break between the members finishing college and reforming in 2011. King Kong Company are best known for their self-titled album released in 2016 and their quirky stage performances.
Up next, Jess Murray will be chatting to David Keenan. The Dundalk musician and artist is no stranger to the picnic, or Hot Press for that matter, with a career in music spanning far longer than his baby-face would suggest. Fresh off the back of a brand new single, and protesting how the government are handling the housing crisis in front of Dáil Éireann, we might even belt out a ‘happy birthday’ for the lad, who gave up his celebrations to stand up for what he believes in.
David Keenan @WildRoots Day 2
Irish post-punk band The Murder Capital will chat to Lucy at approximately 5:30pm. Formed in Dublin in 2017, their music is described as dark, intense, and introspective, focusing on themes of vulnerability, self-reflection, and emotional turmoil. The group have been incredibly successful thus far – not just around Ireland, but internationally. Band members consist of James McGovern (vocals), Damien Tuit (guitar), Cathal Roper (guitar), Gabriel Paschal Blake (bass), and Diarmuid Brennan (drums). The five-piece released their debut studio album, When I Have Fears, to widespread critical acclaim, followed by a sold-out tour across the UK and Europe in promotion of the album. The group have also performed as opening acts for Pearl Jam at the British Summer Time Festival and at the 22nd Coachella Valley Music and Arts Festival in April 2023.
Classed as one of the premier Smiths and Morrissey tribute acts around, These Charming Men, hailing from Dublin, will join Jess on stage for a chat about their career, which stretches as far back as 1995. Described by Andy Rourke as “spooky” and “spot on,” the group have also been praised by Morrissey. “I’m very impressed and flattered by what they do,” the artist commented. A must for all Smiths and Morrissey fans, These Charming Men have been performing at the Annual Smiths/Morrissey Convention in Los Angeles since 2001 and were requested to appear in Morrissey’s absence at the Fuji Rock Festival in Japan in 2004 to play to an audience of some 35,000 people.
Kingfishr supporting New Rules at Dolans, Limerick, 1st of February 2023. Copyright Aoife Moloney/hotpress.com.
Closing out the day at the Hot Press festival HQ, Molly Cantwell catches up with Kingfishr, a trio of lads based in the Irish capital of music – Limerick. Kingfishr pride themselves on their musical independence, producing, writing and recording all of their own music at their home studio, which happens to be located on a dairy farm. Eddie, McGoo and Fitz met in 2017, studying engineering at University of Limerick, with their debut single ‘flowers-fire’ released the week before their final exams. Since then, they’ve sky-rocketed into major success, opening for the likes of Bruce Springsteen, George Ezra, and Dermot Kennedy – an incredible feat in under two years. A fresh spin on Celtic classics is what underpins the band, with their first appearance at Electric Picnic slated to be an exciting one.
4:00pm – King Kong Company
4:45pm – David Keenan
5:30pm – The Murder Capital
6:00pm – These Charming Men
7:00pm – Kingfishr
Saturday kicks off bright and early – well, ish!
Jess Murray will chat with Dublin-based Ispíní na hÉireann who combine original songs and reimagined covers, with influence from the Irish musical tradition, and generally, having the craic. Banjo player Adam J. Holohan and guitar player Tomás Mulligan started the band, slowly but surely adding cellist Aongus MacAmhlaigh, bodhrán extraordinaire Cian ‘Kinko’ Ó Ceallaigh, and saxophone and Uilleann Pipes player Pádraig Óg Mac Aodhagáin. The group also occasionally features guest players from all over the island. The group insist there are layers of subtext and commentary, and a genuine method to all their madness, but they like to leave their art open to the interpretation of the listener.
Credit: Carl McGrath.
Following that, Murray returns with White Horse Guitar Club. Founded 10 years ago in The White Horse in Cork, the group embarked on a journey that flourished into something they could never have dreamed of. Blending the finest Americana with Irish Roots, the 11 men, 11 voices, and 11 guitars, bring an unparalleled sound which the staff at Hot Press are incredibly excited to hear in our tent!
Minding Creative Minds is back, with Stuart Clark taking the reins for their 1:45pm slot. MCM mentor and consultant Sue Cullen and wellness coach Eoin Ryan will be speaking at the Hot Press tent, in what will no doubt be another fascinating conversation with Clark.
Stuart will then stay on to chat with Irish singer-songwriter and rapper Michael Stafford, better known by his moniker, Maverick Sabre. The London-born, Ireland-reared musician has been releasing music since 2008, with four albums out in the world – Lonely Are the Brave (2012), Innerstanding (2015), When I Wake Up (2019), and Don’t Forget to Look Up (2022). Sabre mentored Jorja Smith, working on a significant amount of music with her. His music has appeared on Grey’s Anatomy, as well as Jools Holland, and way back in 2012, the artist appeared on Jeff Wayne’s Musical Version of The War of the Worlds – The New Generation in the role of Parson Nathaniel.
Maverick Sabre at The 3Olympia, Dublin, 4th of November 2022. Copyright Alex Curran-hotpress.com
Brad Heidi pops by for a quick chat with Lucy O’Toole and a few acoustic tunes. The Galway-based singer-songwriter initially built up a following on the busking circuit after quitting school. He started out travelling around Ireland with just his acoustic guitar; successful singles and a lauded performance at Dublin’s New Year’s Eve festival helped him garner a reputation. Brad has now gone from the streets to selling out his first headline Irish tour and London show, as well as recently supporting Lisa Hannigan, Pa Sheehy, James Blunt, and Gavin James. Earlier this year, Heidi was crowned the overall winner of the A New Local Hero talent search by Hot Press and the Independent Broadcasters of Ireland (IBI) as part of Irish Music Month.
Niall Stokes and Brad Heidi at Irish Music Month’s ‘A New Local Hero’ event at The Academy on April 25th, 2023. Copyright Abigail Ring/ hotpress.com
Dave Lofts & SAND follow, chatting with Lucy. Dave Lofts, a Wicklow native based in Waterford, had spent many years captivating audiences in the South East before handpicking the infectious five-piece indie folk band, SAND, to join him on his journey. As they gear up for an epic journey throughout 2023 and 2024, the supergroup of sorts are ready to take their music to new heights. Dave’s soulful and husky voice combined with the chemistry between the group members will eventually lead to the release of their debut album in 2024, promising to be a musical masterpiece music lovers across the emerald isle won’t want to miss!
Pat Carty returns with his beloved cowboy hat, chatting to Irish psych rock band THUMPER. Founded in 2019 as a solo project by Oisín Leahy Furlong, the guitarist and vocalist expanded the group to perform the songs at festivals. Now including Alan Dooley (guitar, backing vocals), Alex Harvey (guitar, backing vocals), Dav Campbell (bass, backing vocals), Stevie D’Arcy (drums), and Benedict Warner-Clayton (drums), the group have found huge success in the Irish scene and further afield. Their debut album, Delusions of Grandeur, was released in 2022 and was nominated for the Choice Music Prize.
Stuart returns to chat all things Dec Pierce. The radio presenter and well-known Irish figure is best known for his radio show ‘Block Rockin’ Beats’, a ’90s dance music show he has transformed into a touring performance of the world’s biggest dance anthems, featuring a DJ set with full live band, plus special guests. Recently, Pierce revealed he was hospitalised with a brain haemorrhage, but was back to work, presenting and on tour, within a couple of weeks. The 40-year-old father of one is much beloved by the Irish public, consistently selling out gigs all around Ireland.
Pat meets Dingle singer-songwriter Pa Sheehy, who has broken out as a musical standout following the break-up of his well-known band, Walking on Cars. Having recently released his second solo EP Lost In A ‘90s Arcade, the artist has found his footing in the folk-indie scene, elevating his sound from that of his previous band. Following his appearance at 2023’s Electric Picnic, the musician is set to play an extensive tour throughout Europe and Ireland, winding down in his native County Kerry.
Cian Ducrot at The Workman’s Club, September 2022. Copyright Laura Klepeisz/hotpress.com
In an exciting interview, Stuart Clark meets Cian Ducrot, the Irish-French TikTok extraordinaire currently taking the world by storm. Featured in our Electric Picnic Hot Press Edition and having recently released his debut album, Victory, Ducrot will have lots to chat about. Having opened tours for Ed Sheeran, Dermot Kennedy, and more, the Cork man is sure to have stories we’re extremely excited to hear.
Closing out the evening, Molly Cantwell chats to HamsandwicH just after their performance on the Rankin’s Wood stage. The indie rock band from Kells, County Meath, is made up of Niamh Farrell, Podge McNamee, and Brian Darcy. With four albums under their belt, the group are currently celebrating their 20th birthday year. Initially formed by John Moore, Podge McNamee and Niamh Farrell, the band expanded to bring Brian Darcy and Ollie Murphy on board. In 2010, Moore left the group, with David McEnroe introduced as his replacement. Following the pandemic, the group have shrunk back down to a three-piece, touring Ireland and the UK, playing various festivals throughout the summer of 2023.
Hamsandwich @WildRoots Day 2
1:00pm – Ispíní na hÉireann
1:15pm – White Horse Guitar Club
1:45pm – Minding Creative Minds
2:30pm – Maverick Sabre
4:00pm – Brad Heidi
4:15pm – Dave Lofts & SAND
4:45pm – THUMPER
5:15pm – Dec Pierce
5:45pm – Pa Sheehy
6:15pm – Cian Ducrot
6:45pm – HamsandwicH
The final day of EP sees an exciting slew of acts come through the Hot Press doors (or whatever the tent equivalent is).
Stuart Clark kicks off the afternoon with the Lightning Seeds. The English rock band was formed in Liverpool in 1989 by Ian Broudie, formerly of the bands Big in Japan, Care, and Original Mirrors. Originally intended as a studio-based solo project for Broudie, the Lightning Seeds expanded into a touring band following the album Jollification. The group experienced huge success throughout the 1990s and have become well known for their single ‘Three Lions’, a collaboration with David Baddiel and Frank Skinner.
Lucy O’Toole chats to two of our Hot for 2023 acts, Scustin and Big Sleep.
Purveyors of pub talkin’ funk, the Bray band, Scustin, are what happens if you stir together Blindboy, Mike Skinner, and Jamiroquai – and add a few pints into the mix. Passionate about their Guinness, excellent at whipping up characters, and putting pressure on the government (in an entertaining manner), Scustin are the perfect combination of hilarious and whimsical. Throughout 2022 and 2023, the band have been performing to sold out crowds, opening for the likes of The Scratch, and bringing their circus-like show to audiences across the biggest Irish festivals.
The Italian-Irish quartet, Big Sleep, are an indie-alternative group, taking influence from classic rock and funk – such as JJ Cale, Jamiroquai, and Joy Division. Taking off while still at school, the group expanded to incorporate emerging talents from Dublin’s live music scene. The band now balances pop and soul while still maintaining an indie feel, creating tunes that Irish listeners love. In the lead-up to the release of their sophomore EP, the band are popping out banger after banger, maintaining a true-to-themselves sound that matures with the group.
Again, at 1:45pm on Sunday, September 3, MCM mentor and consultant Sue Cullen and wellness coach Eoin Ryan will return to the tent for a panel discussion with comedian, musician, actor and writer Tadhg Hickey; comic Ailish McCarthy; and Hermitage Green’s Dan Murphy, all facilitated by Stuart Clark.
Iseult Dunne and Stuart Clark at Minding Creative Minds Creative Careers Summit on May 17th, 2023. Copyright Abigail Ring/ hotpress.com
Next up, the highly anticipated indie trio Rowan, made up of multi-instrumentalists Dylan Howe, Fionn Hennessy-Hayes and Kevin Herron, chat to Stuart. The group’s harmonically rich compositions are laden with melodies that bring the listener to a familiar place, with a tasteful twist of their own. Calling their sound a “melange of indie,” the group have created an impressive catalogue of music in their short lifespan, with undoubtedly something for everyone in there.
Finally, Irish hip-hop duo, Tebi Rex, chat all things EP with Molly Cantwell. Described as alternative indie-hip-hop-pop, their sound is defined by the duality between the duo’s two styles. The duo was formed by Max Zanga and Matt O’Baoill in 2016. Zanga is the rapper while O’Baoill is his melodic counterpart, providing exceptional hooks and choruses. The two are part of the Word Up Collective, a collective of independent Irish artists. Music from the two’s debut album, The Young Will Eat The Old, was featured in Normal People, Sally Rooney’s book-turned-television series. Tebi Rex released their second album, It’s Gonna Be Okay, in 2021.
Tebi Rex. September 2021. Copyright Miguel Ruiz.
1:00pm – Lightning Seeds
1:15pm – Hot for 2023: Scustin
1:30pm – Hot for 2023: Big Sleep
1:45pm – Minding Creative Minds
2:30pm – Rowan
3:15pm – Tebi Rex
Keep an eye on the Hot Press social media for more information, and to see any updates we may have as the weekend goes on.
Chatbots are computer programs that simulate human interaction by interpreting user input and returning a result. Rule-based chatbots deliver predetermined, constrained answers using either decision-tree logic or a database of keywords. These chatbots may present the user with a prepared menu of choices or guide the user between alternatives based on a specified set of keywords. More advanced chatbots employ other technologies to produce answers. Chatbots have been extensively embraced by banks, mortgage servicers, and debt collectors.
Access to the correct data at the right time can make the difference between millions in profit and millions in loss in a data-rich industry. According to a report by the Consumer Financial Protection Bureau (CFPB), financial institutions are using chatbots more frequently to cut down on the costs of hiring human customer service representatives. They are also moving away from basic, rules-based chatbots and towards more complex technologies like large language models, generative chatbots, and other tools dubbed artificial intelligence (AI).
Chatbots may be helpful for providing simple answers, but as queries become more complicated, their usefulness decreases. The CFPB warns that financial institutions run the risk of breaking federal consumer protection law when implementing chatbot technology because a review of consumer complaints and of the current market shows that some people experience significant negative outcomes due to the technical limitations of chatbot functionality.
Customers’ faith in a business can be severely damaged if chatbots are badly made or if customers can’t get help when they need it. The risks include potential breaches of confidentiality and security. In particular, these chatbots may be programmed to use machine learning, or technology sometimes marketed as “artificial intelligence,” to mimic human dialogue.
Over 98 million customers engaged with a bank’s chatbot in 2022, a figure anticipated to rise to 110.9 million users by 2026.
Chatbot use in consumer finance
Financial institutions have historically interacted with both potential and current consumers via a range of methods. The purpose of bank branches is to provide clients with a location close to their homes where they can perform banking transactions and get customer care and support. Relationship banking has long placed a premium on having face-to-face interactions with financial institutions.
Financial institutions have gradually expanded their contact centers, previously known as call centers, to enable clients to communicate with their institutions more conveniently. As these organizations expanded, many of their contact center operations turned to interactive voice response (IVR) technology to route calls to the proper parties and save expenses.
The introduction of chat in consumer finance allowed customers to have real-time, back-and-forth interactions over a chat platform with customer service agents. Financial institutions deployed online interfaces for customer support as new technology became available, such as mobile applications, the ability to send and receive messages, or through “live chat.”
The introduction of chatbots, which mimic human replies via computer programming, was mostly done to save the price of hiring actual customer support representatives. In order to automatically generate chat responses using text and voices, financial institutions have recently started experimenting with generative machine learning and other underlying technologies like neural networks and natural language processing. Below, we describe the use of chatbots for customer support.
The technology used as a foundation, such as large language models and “artificial intelligence”
Computer programs called chatbots imitate some aspects of human speech. They all take user input and employ programming to create an output, despite the fact that they might differ greatly in terms of complexity, automation, and features.
Rule-based chatbots launch predetermined, constrained answers using either decision tree logic or a database of keywords. These chatbots can provide responses depending on established criteria, such as providing the user with a prepared menu of possibilities to choose from or guiding the user between options based on a specified set of keywords. The user is often restricted to specified inputs. For instance, a bank chatbot may provide the customer with a predetermined selection of alternatives, such as checking their account balance or making a payment.
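The decision-tree and keyword pattern described above can be sketched in a few lines of Python. This is a minimal, hypothetical illustration; the menu options and canned replies are invented, not any bank's actual system:

```python
# Minimal sketch of a rule-based banking chatbot: a keyword map
# routes the user's message to one of a few canned responses.
# All keywords and replies here are hypothetical.

RULES = {
    "balance": "Your checking balance is $1,234.56.",
    "payment": "Your next payment of $150.00 is due on the 28th.",
    "card": "To lock or replace your card, choose: lock / replace.",
}

MENU = "I can help with: " + ", ".join(RULES) + "."

def reply(message: str) -> str:
    """Return the first canned answer whose keyword appears in the message."""
    text = message.lower()
    for keyword, answer in RULES.items():
        if keyword in text:
            return answer
    # Anything outside the keyword set falls back to the menu --
    # the rigidity the CFPB report describes.
    return "Sorry, I didn't understand. " + MENU

print(reply("What's my balance?"))      # matches the "balance" keyword
print(reply("My account was hacked!"))  # no keyword match: falls back to the menu
```

The fallback branch is where real customers get stuck: any request outside the keyword set, however urgent, is met with the same menu.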
More advanced chatbots create replies using different technologies. These chatbots may be created to mimic natural conversation using machine learning or other software frequently referred to as “artificial intelligence” (AI). More sophisticated chatbots use LLMs to examine word patterns in large datasets and predict the text that will be used to answer a user’s question.
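The "examine word patterns and predict text" idea can be illustrated with a toy bigram model. Real LLMs use deep neural networks trained on vastly larger corpora, so this is only a conceptual sketch of statistical next-word prediction:

```python
# Toy illustration of statistical next-word prediction: count which
# word most often follows each word in a tiny corpus, then predict
# by picking the most frequent successor. Real LLMs use neural
# networks, but the core idea -- predicting likely next tokens from
# patterns in training data -- is the same.
from collections import Counter, defaultdict

corpus = (
    "your balance is low your balance is available "
    "your payment is due your payment is late"
).split()

# Count successors for each word in the corpus.
successors = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    successors[current][nxt] += 1

def predict(word: str) -> str:
    """Most frequent word observed after `word` in the corpus."""
    return successors[word].most_common(1)[0][0]

print(predict("balance"))  # "is" -- the only word ever seen after "balance"
print(predict("your"))     # "balance" or "payment", whichever counts higher
```

Note that the model has no notion of truth, only of frequency, which is exactly why generative chatbots trained this way can reproduce false information fluently.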
Exclusive industry insights
Exclusive insights by Dan Adamson CEO of Armilla AI, a company that provides AI validation and development tools to create responsible and robust AI, including AutoAlign, a tool that provides safeguards for generative AI.
The Evolution of Chatbots for Consumer Finance
Chatbots are set to become more prevalent, more powerful, and potentially more dangerous in consumer banking. The biggest banks in America are all now using chatbots for customer service, with an estimated 37% of consumers having engaged them. However, these chatbots are rather “basic” in both scope and intelligence. Most are largely rules-based, with some limited AI integrated (for example, to increase coverage by parsing questions into root question types).
Unless you have been living under a rock, you have probably seen much more sophisticated chatbots such as ChatGPT, Google’s Bard, or Microsoft’s Bing Chat (powered by GPT) that can handle much more complicated queries and have much more intelligent conversational capabilities. These systems are “Large Language Models” (LLMs) and are part of a powerful new class of generative AI. They could also be a very powerful tool for consumers: imagine the chat interface becoming the primary interface for interacting with your bank:
“Based on my upcoming bills and credit card billing cycle, can I afford to buy this $400 jacket now or should I wait?”
We can also imagine chatbots connected to other services:
“Look for 2 tickets to tonight’s Yankees game in the upper section, and buy them if I have enough credit room on my card.”
While these chatbots could become very powerful, there are a number of risks around them, and these will only get worse with the next generation of chatbots. Even the CFPB is warning of the dangers. For this next generation, some older best practices will become even more critical: for example, it will become very hard to tell a chatbot from a human, so the bank will always need to make it clear to users that they are, in fact, talking to a bot.
However, builders of these more advanced chatbots will also need to be very careful in new ways before they can be deployed. These new models can be very convincing at making things up (“hallucinating”), and can also change their tone and suddenly be rude to customers. They can also exacerbate biases or be “jailbroken” into making harmful statements.
Luckily, there are also technologies emerging to protect against these risks. For example, AutoAlign can help automate testing and fine-tuning of these models, and can provide guardrails to reduce biases, hallucinations and jailbreaking.
It will be interesting to see how quickly financial institutions can roll out this new next generation of chatbots and what it means for consumers – hopefully the right balance can be struck where chatbots become more useful while not creating new harms.
Use of large language models and AI
Chatbots are software applications that attempt to simulate human dialogue. All of these systems take in user input and process it according to predetermined guidelines to generate an output; nevertheless, they can vary widely in complexity, automation, and features.
The predetermined, limited responses of rule-based chatbots are triggered by either decision tree logic or a database of keywords. These chatbots can either provide the user with a prepared list of options to choose from or guide them through the available choices based on a specified set of keywords and predetermined rules. In most cases, the user’s choices for input are severely restricted. A chatbot provided by a financial institution may present the user with a limited set of alternatives, such as viewing the account balance or making a payment.
Additional technologies are used by more advanced chatbots to generate responses. Machine learning and other forms of so-called “artificial intelligence” could be incorporated into the design of such chatbots in order to better mimic human conversation. LLMs are also used by more advanced chatbots to examine the relationships between words in vast datasets and make predictions about the best course of action when responding to a user’s query.
Domain-specific chatbots, such as those designed for the healthcare, academic, and financial sectors, are focused on assisting users with narrower goals. The financial sector is the primary focus of our investigation.
Growth and adoption in the financial industry
In the financial sector, chatbots are prominently displayed on websites, smartphone apps, and social media pages of banks, mortgage servicers, debt collectors, and other financial firms. Over 98 million people, or over 37% of the US population, interacted with a bank’s chatbot in 2022-23. By 2026, this figure is anticipated to reach 110.9 million users.
Notably, the top 10 commercial banks in the nation all engage consumers using chatbots of differing levels of sophistication. These chatbots may sometimes be given human identities, employ popup features to entice interaction, and even send and receive direct messages on social media. Certain qualities, such as their 24/7 availability and prompt replies, may help to explain why financial organizations are using chatbots to offer customer service. Adoption may also be motivated by cost reductions for these institutions. For instance, statistics indicate that, compared to customer service models that use human agents, chatbots save $8 billion annually, or almost $0.70 per customer encounter.
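Taken together, those two savings figures imply a rough annual volume of automated interactions. A quick back-of-the-envelope check, using only the numbers cited above:

```python
# Sanity-check the cited savings figures: $8B per year at roughly
# $0.70 saved per interaction implies the annual interaction volume.
annual_savings = 8_000_000_000   # dollars, as cited
savings_per_interaction = 0.70   # dollars, as cited

interactions = annual_savings / savings_per_interaction
print(f"Implied interactions per year: {interactions:,.0f}")  # roughly 11.4 billion
```

An implied eleven-billion-plus interactions a year across the industry gives a sense of why even small per-interaction savings drive adoption.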
Since they first entered the financial industry almost ten years ago, chatbots have become increasingly popular. Today, the majority of the sector at least employs straightforward rule-based chatbots that use either decision tree logic or databases of keywords or emojis to set off pre-programmed, constrained responses. These chatbots are often powered by proprietary third-party technology firms. For JPMorgan Chase and TD Bank, for instance, Kasisto offers conversational, money-focused chatbots, while Interactions serves Citibank.
As the use of chatbots has increased, some companies, like Capital One, have developed their own chatbot technologies by using real customer conversations and chat logs to train algorithms. Capital One released Eno, a text messaging chatbot, in March 2017. Capital One claims that Eno, like other banking chatbots, can do things like monitor account balances, recent transactions, and available credit; pay bills; activate, lock, or replace a card; and keep track of when payments are due. Bank of America’s chatbot, Erica, had handled more than a billion exchanges with nearly 32 million users by the fourth quarter of 2022.
Financial institutions have only lately begun to implement cutting-edge technologies like generative chatbots and other products billed as “artificial intelligence.” In Q1 2023, for instance, Goldman Sachs’ chief information officer suggested that the bank’s engineering staff start creating its own “ChatGS,” or LLM chatbot, to help the bank’s employees store knowledge and respond to important customer questions immediately.
The industry has improved its use of chatbot technology by, in certain cases, depending on the biggest technological corporations for datasets or platforms. For instance, Truist announced its digital assistant in Q3 2022, built on top of Amazon Lex, an AWS product. Wells Fargo announced the launch of Fargo in Q3 2022, a new chatbot virtual assistant using Alphabet’s Google Cloud platform that will use LLMs to process customers’ input and offer tailored responses. Additionally, Morgan Stanley announced that it is testing a new chatbot that is powered by Microsoft-backed OpenAI’s GPT 4 technologies.
The capabilities of chatbots go beyond text messages. Consider U.S. Bank, which released Smart Assistant through a smartphone app in June 2020. Like other banking chatbots, Smart Assistant adheres to simple rule-based functionality designed to perform daily banking tasks, such as locating users’ credit scores, transferring money between accounts, contesting transactions, and facilitating payments to other users through Zelle. Smart Assistant responds primarily to voice prompts and accepts text inquiries as a backup option.
Many financial institutions are now using rule-based chatbots that are supported by social media sites as well. The majority of the top 10 banks in the US allow business chat and direct communications on either Twitter or Meta’s Facebook and Instagram. The Business Chat on Facebook and Instagram, among other things, generates automatic answers.
Problems consumers encountered while engaging with chatbots
Sectors all around the economy are increasingly transitioning from human help to algorithmic support, just as customer service shifted decades ago from in-person interactions to remote contact centers.
Public complaints about consumer financial services and products are gathered by the CFPB. As financial institutions employ chatbots more often, complaints from the general public outline problems consumers encountered while engaging with chatbots more frequently. These problems may raise concerns about whether current laws are being followed. Below, we include some of the difficulties encountered by consumers as described in CFPB complaints. We also look at problems that chatbot adoption has created for the whole industry.
Limited ability to solve complex problems
The phrase artificial intelligence, or “AI,” is used to imply that a consumer is interacting with a highly complex system and that the answers it produces are accurate and intelligent. However, “AI” and automated technology can take many different shapes. People may really be interacting with a very simple system that is only capable of fetching and repeating basic information or sending users to FAQs or policies. Chatbots should not be used as the main method of customer care when the “AI” cannot grasp the customer’s request or when the customer’s message conflicts with the system’s programming.
Difficulties in recognizing and resolving peoples’ disputes
Financial institutions are trusted by the public to promptly and accurately recognize, investigate, and settle problems. Entities must correctly recognize whether a customer is voicing a complaint or a dispute since it is part of these consumer expectations and regulatory duties. A degree of rigidity may be introduced by chatbots and carefully scripted customer service agents, wherein only certain phrases or syntax may signal the identification of a problem and launch the dispute resolution procedure. As a consequence, it’s possible that chatbots and scripts won’t be able to recognize a conflict.
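That rigidity, where only certain phrases or syntax signal a dispute, can be sketched as a simple phrase match. The trigger phrases below are hypothetical; real systems vary, but the failure mode is the same:

```python
# Sketch of a rigid dispute trigger: the dispute workflow only starts
# if the customer happens to use one of a few exact trigger phrases.
# The phrases here are hypothetical.
TRIGGER_PHRASES = {"dispute this charge", "report fraud", "unauthorized transaction"}

def starts_dispute(message: str) -> bool:
    """True only if the message contains an exact trigger phrase."""
    text = message.lower()
    return any(phrase in text for phrase in TRIGGER_PHRASES)

# A customer clearly raising a dispute, but not in the scripted
# wording, slips through unrecognized:
print(starts_dispute("I want to dispute this charge"))             # True
print(starts_dispute("I never bought this, take it off my bill"))  # False
```

The second message is unmistakably a dispute to a human reader, yet it never launches the dispute resolution procedure.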
Even when chatbots can recognize that a consumer is raising a disagreement, there can be technological barriers preventing them from investigating and resolving it. Customers may dispute purchases or inaccurate information. Chatbots that are only capable of repeating back to customers the same system information that they are seeking to dispute are inadequate. Such echoing back does not effectively address issues or questions.
The fact that rule-based chatbots are designed to receive or process user-provided account information and cannot reply to requests that fall outside of their data inputs makes them one-way streets as well. Because the technology has only been trained on a small number of dialects, it can be challenging for customers with varying linguistic demands to utilize chatbots to get assistance from their financial institutions. This is especially true for those who speak English as a second language.
A chatbot with limited syntax can feel like a command-line interface where customers need to know the correct phrase to retrieve the information they are seeking. Going through the motions of a simulated discussion, although being touted as more convenient, may be tiresome and opaque in comparison to exploring material with simple and logical navigation.
Providing inaccurate, unreliable, or insufficient information
The repercussions of being incorrect may be severe when a person’s financial life is at stake, as is shown in more detail below. Chatbots sometimes provide the incorrect response. When financial institutions use chatbots to provide customers with information that must be accurate under the law, getting it incorrect might be against their duties.
Particularly, sophisticated chatbots that use LLMs may struggle to provide information that can be relied upon. The underlying statistical approaches are not well-suited to discern between factually valid and erroneous data for conversational, generative chatbots trained on LLMs. Because of this, these chatbots could use datasets that include examples of false or misleading information, which are subsequently duplicated in the material they produce.
Chatbots may provide erroneous information, according to recent research. For instance, a comparison study of Alphabet’s LaMDA, Meta’s BlenderBot, and Microsoft-backed OpenAI’s ChatGPT revealed that these chatbots frequently produce inaccurate results that are undetectable by some users. Recent tests of the Alphabet’s Bard chatbot revealed that it also produced fictional results. Additionally, a recent study revealed that Microsoft-backed OpenAI’s ChatGPT can exacerbate biases in addition to producing inaccurate results.
Users are asking generative chatbots for financial advice, despite the fact that they can be inaccurate. For instance, according to a survey, people use LLM chatbots to get recommendations and advice on credit cards, debit cards, checking and savings accounts, mortgage lenders, and personal loans.
The goal of using chatbots in banking is to provide consumers with quick assistance with their problems. Customers could have no other options if a chatbot is supported by faulty technology, erroneous data, or is nothing more than a portal into the business’s open policies or FAQs. For financial institutions, it is essential to respond to customers’ questions about their financial life in a trustworthy and accurate manner.
Failure to provide meaningful customer assistance
Automated replies from a chatbot may be unable to assist a client with their problem, instead dragging them around in endless loops of repetitious, useless language or legalese without providing an exit to a real customer support agent. These “doom loops” are often triggered when a customer’s problem is beyond the chatbot’s scope, preventing them from having a thorough and perhaps necessary interaction with their financial institution. As mentioned previously, some chatbots use LLMs to produce answers to frequent client questions. While some individuals may be able to use a chatbot to get an answer to a particular question, the same technology may make it difficult to get a precise and trustworthy response.
Financial organizations could claim that automated processes are more effective or efficient because, for instance, a person might get a response right away. Automated answers, however, may be heavily scripted and sometimes just point consumers to long policy statements or FAQs, which, if they do include any information at all, may not be particularly useful.
These systems may simply be shifting the obligations of skilled customer service representatives, or the burden of explaining such rules, onto a less expensive automated procedure. As a consequence, a number of clients have complained to the CFPB about chatbots trapping them in “doom loops”.
Hindering access to timely human intervention
Chatbots are often unable to substantially customize services for distressed consumers since they are intended to handle certain tasks or obtain information. Customers may experience anxiety, tension, confusion, or frustration when they seek help with financial issues. The restrictions of a chatbot may prevent the client from accessing their essential financial information and raise their irritation.
Research has shown that when individuals feel worried, their attitudes toward risk and choices change. One study, for instance, indicated that after interacting with a chatbot, 80% of customers felt more annoyed and 78% felt the urge to speak to a human.
Additionally, clients may not be aware of these chatbots’ limitations, or of the absence of a real customer support agent, when they first establish a relationship with a financial institution. That only becomes obvious once a problem arises and consumers are forced to spend time and energy trying to fix it, which wastes their time, limits consumer options, and undercuts financial institutions that are attempting to compete by making substantial and efficient customer service investments.
Deploying advanced technology may also be a deliberate decision by organizations looking to increase revenue or reduce write-offs. Indeed, sophisticated technology may make charge waivers or price negotiations less likely.
Technical limitations and associated security risks
Reliability of the system and downtime
At a high level, how a company decides to prioritize features and allocate development resources affects how reliable its automated systems will be. For instance, enhancing an automated system’s capacity to recommend pertinent financial products to a given consumer based on their data may be prioritized in order to boost income. This investment could come at the expense of functions that don’t increase sales. Therefore, even if an automated system is adept at managing certain client tasks, it could struggle with others.
Chatbots may also malfunction, just like any other piece of technology. People can be trapped with little to no customer care if a malfunctioning chatbot is their financial institution’s sole option for providing time-sensitive assistance. We may see some of the chatbots’ technological limits through customer complaints, for instance.
A disappointed consumer just sees a non-functional chatbot, regardless of whether it is a programming or software problem.
Threats to computer security from spoofing and phishing websites
Because they are automated, chatbots are often utilized by criminals to create phoney, impersonation chatbots that carry out mass phishing attacks. Conversational bots often appear “human-like,” which may lead users to trust them and divulge more information than they would in a straightforward online form. When consumers provide these impersonator chatbots with their personal information, it becomes rather risky. Scammers increasingly prey on users of popular messaging services, seeking their personal or financial data or fooling them into paying fictitious fees via money transfer apps.
Chatbots may be trained to phish for information from another chatbot in addition to utilizing impersonating chatbots to hurt customers. Chatbots may be programmed to adhere to specific privacy and security protocols in these circumstances, but they may not be able to identify and react to attempts by scammers to phish for personal information or steal peoples’ identities. They may also not be programmed to detect suspicious patterns of behavior or impersonation attempts.
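A minimal example of the kind of suspicious-pattern screening described here might look like the following. The patterns are illustrative only; real fraud and phishing detection combines many more signals than keyword matching:

```python
import re

# Illustrative-only heuristic: flag chatbot inputs that appear to
# solicit credentials or payment details. The patterns below are
# examples, not a production rule set.
SUSPICIOUS_PATTERNS = [
    r"\bpassword\b",
    r"\bsocial security\b",
    r"\bcard number\b",
    r"\bwire (me|the) money\b",
]

def looks_like_phishing(message: str) -> bool:
    """Flag messages matching any credential-solicitation pattern."""
    text = message.lower()
    return any(re.search(pattern, text) for pattern in SUSPICIOUS_PATTERNS)

print(looks_like_phishing("Please confirm your password to continue"))  # True
print(looks_like_phishing("What's my balance?"))                        # False
```

Even a crude screen like this would catch the most obvious credential solicitations; the point is that many deployed chatbots carry no such check at all.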
For instance, in Q2 2022, fraudsters pretended to be DHL, a corporation that provides express mail and package delivery services, and sent victims to a chatbot that demanded extra shipping fees before products would be delivered. The chatbot dialogue seemed legitimate thanks to the inclusion of a captcha form, email and password prompts, and a picture of a damaged shipment.
Keeping personally identifiable information safe
Regardless of the technology employed, financial institutions are required to keep personally identifiable information secure. Security researchers have highlighted a number of potential chatbot vulnerabilities, from entities using antiquated and insecure web transfer protocols to consumers entering personal information when they are in need of assistance.
For instance, users are often asked to provide personal data in order to verify that they are the owner of a certain account. Customers expect a corporation to handle their personal and financial information with care and with confidence when they provide it to them. As a result, chat logs that include personally identifiable information from consumers should be treated as sensitive consumer information and maintained safe from hacking or intrusion.
Chat logs provide a new entry point for privacy violations, making it more difficult to fully secure customers’ personal and financial information. In 2018, Ticketmaster UK used Inbenta Technologies for a number of services, including a “conversational AI” on its payments website. When hackers attacked Inbenta’s servers, the intrusion affected 9.4 million data subjects and exposed 60,000 unique payment card numbers.
Privacy violations may occur when training data contains personal information that is then directly disclosed by the model without the consent of the individual affected. Some of these risks appear to be recognized by financial institutions, at least as it pertains to their internal information, as several large banks have restricted use.
For AI systems like chatbots, comprehensive security testing is required. This testing must be rigorous, and any third-party service providers participating in operations must be thoroughly audited. Without adequate safeguards, there are just too many weaknesses for these systems to be trusted with sensitive client data.
The CFPB’s consumer complaints about chatbots raise questions about whether using chatbots prevents institutions from safeguarding the security of customer data.
The perils of implementing flawed chatbots
Financial institutions should take into account the technology’s limits, such as those described in this research, as they continue to invest in tools like chatbots to manage customer service while also cutting costs. Risks associated with using chatbot technology as the main method of communicating with people include the following for certain financial institutions:
Possibility of breaking federal consumer financial regulations
Federal consumer financial laws enacted by Congress impose a range of pertinent requirements on financial firms. Among other things, these requirements include giving customers accurate responses, which helps ensure that financial institutions treat customers fairly.
When chatbots ingest customer messages and respond, financial institutions run the risk that the information a chatbot provides is inaccurate, that the technology fails to recognize when a customer is asserting their federal rights, or that it fails to protect the customer's privacy and data.
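The second of those risks, failing to recognize an assertion of federal rights, can be partially mitigated with explicit escalation rules that hand such conversations to a human. The sketch below is a minimal, hypothetical illustration using keyword matching; the trigger list and function names are assumptions for this example, and a real system would need much more robust intent detection than substring checks.

```python
# Hypothetical phrases that may signal a customer invoking a federal
# consumer right (e.g., billing-error disputes under Regulation Z or E).
ESCALATION_TRIGGERS = [
    "dispute",
    "stop payment",
    "unauthorized",
    "fraud",
    "error resolution",
]

def should_escalate(message: str) -> bool:
    """Return True if the message may be asserting a protected right."""
    text = message.lower()
    return any(trigger in text for trigger in ESCALATION_TRIGGERS)

def route(message: str) -> str:
    # Hand off to a live agent whenever a protected request is detected,
    # rather than letting the chatbot attempt to resolve it.
    return "human_agent" if should_escalate(message) else "chatbot"
```

The design choice here is to fail toward human review: false positives cost an agent's time, while false negatives risk exactly the regulatory exposure this section describes.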
When chatbots replace humans, customer service and trust suffer
Consumers may have urgent needs when they contact their financial institution for help. If they get stuck in loops of repetitive, unhelpful responses, cannot trigger the right rules to get the answer they need, and cannot reach a live customer service agent, their confidence and trust in the institution will erode.
Given the nature of the markets for many consumer financial products and services, people may have little negotiating power when a provider is chosen for them. Consumers have little or no choice over their mortgage servicer or credit reporting agency, for example. And even in markets where customers have more options, financial institutions may not compete fiercely on dimensions like customer service, because customers only experience those benefits after choosing a provider and are, in effect, locked in.
In these situations, the potential for significant cost savings may strongly incentivize institutions to route customer care through chatbots or other automated systems, even if doing so degrades the customer experience. Importantly, in markets where competition is limited or absent, less of that cost savings is passed on to customers in the form of improved products and services.
Financial firms could even go so far as to limit or eliminate individualized human help, but trust and service quality are likely to suffer as a result. This trade-off is especially acute for consumer groups, such as those with limited access to technology or limited English proficiency, whose chatbot interactions fail to reach resolution at higher rates.
Risk of harming people
Beyond eroding customer confidence, chatbot failures in the markets for consumer financial products and services can have far-reaching negative effects. When someone's financial security is at stake, the consequences of a mistake are significant. Being able to identify and resolve consumer disputes matters, because a dispute is sometimes the only practical way to quickly correct an error before it causes worse outcomes. Providing false information about a consumer financial product or service could be disastrous: it could lead to the assessment of unjustified fees, which in turn could produce worse outcomes such as default or the customer choosing an inferior product, among other harms.
Likewise, ignoring or sidestepping a customer's dispute can itself be harmful. It may undermine their confidence in the institution, discourage them from seeking help in the future, waste their time and cause frustration, leave otherwise manageable problems unresolved, and compound negative effects.
By deploying subpar chatbots, financial institutions risk alienating their customers and causing them serious harm, for which they may be held liable.
The CFPB's statement continues the now-familiar theme of hostility toward AI and algorithms, emphasizing the threat that algorithmic banking poses to relationship banking. And while the statement clearly seems intended to make financial institutions pay close attention to the issues the bureau identifies, it is unclear whether the CFPB will actually elevate what appear to be customer service concerns into alleged legal violations.
This paper highlights some of the difficulties of deploying chatbots in consumer financial services. As industries across the economy continue to fold "artificial intelligence" technologies into customer service operations, there will likely be strong financial incentives to move away from support delivered in person, over the phone, and via live chat.
Poorly built chatbots that restrict access to live, human help may result in legal violations, diminished service, and other harms. The CFPB will continue to closely examine the long-term effects of the shift away from relationship banking and towards algorithmic banking.