HKBU Integrates AI and Art-Tech in Gala Concert

Hong Kong Baptist University (HKBU) held a press conference on April 27, 2026, to preview the HKBU Symphony Orchestra Annual Gala Concert 2026, which will blend symphonic music, digital technology, and artificial intelligence, PR Newswire reported. The concert will take place at the East Kowloon Cultural Centre as part of HKBU's 70th anniversary and is described in reporting as the venue's first orchestral performance combining art-tech and AI since it opened, PR Newswire and Bandwagon wrote. The programme includes a reimagined performance of "The Legend of WuKong" in collaboration with 8082Audio, AI-generated virtual avatars, holographic augmented-reality elements, and a guest appearance by humanoid robot Sophia (Hanson Robotics), who will perform three songs with the live orchestra, Antara and Bandwagon reported. Editorial analysis: This event highlights ongoing industry interest in applying real-time AI and immersive art-tech to live performance settings.
What happened
Hong Kong Baptist University (HKBU) held a press conference on April 27, 2026, to introduce the HKBU Symphony Orchestra Annual Gala Concert 2026, PR Newswire reported. The concert will be staged at the East Kowloon Cultural Centre as part of HKBU's 70th anniversary, and public reporting describes it as the venue's first orchestral production to explicitly combine art-tech and AI since the centre opened, PR Newswire and Bandwagon wrote. The event theme is "Live Music ReIMAGINEd," according to PR Newswire.
What was announced
Public coverage lists several headline elements for the programme: a reimagined performance of "The Legend of WuKong" in collaboration with 8082Audio, HKBU student choreography recreating scenes from the game _Black Myth: Wukong_, AI-generated virtual avatars displayed on large screens, and holographic augmented-reality components, Bandwagon reported. Guests at the press conference included Professor Johnny Poon, Professor Chen Jie, Dr Lam Kwan-fai, and Mr Taurin Barreras, per PR Newswire. A major feature is the guest appearance by humanoid robot Sophia, developed by Hanson Robotics, who is reported to sing three original songs accompanied by the live orchestra, Antara and Bandwagon reported. Antara also reported that Dr Lam Kwan-fai and InterMusic Production arranged the three songs for Sophia.
Technical details
Bandwagon and PR Newswire describe the production as mixing live orchestral audio with cinematic visuals, AR-enabled avatars, and stage choreography; Bandwagon explicitly mentions holographic AR interaction between avatars and human dancers. Reporting also links this edition to HKBU's prior experiments with brainwave technology in performances, framing the 2026 show as an evolution of that work, PR Newswire and Antara stated. Professor Johnny Poon is quoted in the press materials saying, "HKBU has always been at the forefront of art tech. The University has conducted collaborations in art tech, artificial intelligence and music performances since 2022," Antara and PR Newswire reported.
Editorial analysis - technical context
Industry-pattern observations: Live performances that combine orchestral audio, real-time avatar rendering, and holographic AR typically require tight synchronization across audio, visual rendering, and stage cues. Companies and production teams undertaking similar integrations often confront latency management, multichannel audio mixing, and lighting/visibility tradeoffs when projecting avatars in physical space. Real-time vocal synthesis or voice processing for robotic performers also imposes constraints on monitoring and mixing workflows used by sound engineers.
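The latency-management pattern described above is often handled with delay compensation: measure each subsystem's output latency, then pad faster paths so everything lands in sync with the slowest one. The following is a minimal illustrative sketch of that idea; the subsystem names and millisecond figures are hypothetical assumptions, not measurements from the HKBU production.

```python
# Illustrative delay-compensation sketch for a mixed live/virtual show.
# All latency figures below are made-up placeholders for illustration.
SUBSYSTEM_LATENCY_MS = {
    "orchestra_audio": 12,   # mic capture + DSP + PA output
    "avatar_render": 90,     # avatar rendering + projection pipeline
    "stage_lighting": 25,    # console processing + fixture response
}

def delay_compensation(latencies_ms):
    """Extra delay (ms) to insert on each path so every output arrives
    in sync with the slowest subsystem: delay + latency is constant."""
    slowest = max(latencies_ms.values())
    return {name: slowest - ms for name, ms in latencies_ms.items()}

comp = delay_compensation(SUBSYSTEM_LATENCY_MS)
for name, extra in sorted(comp.items(), key=lambda kv: kv[1]):
    print(f"{name}: add {extra} ms of delay")
# The slowest path (avatar_render) gets 0 ms added; faster paths are
# padded so all three outputs reach the audience simultaneously.
```

In practice, production teams measure these latencies empirically (e.g., with timecode or click-track tests) rather than assuming fixed values, but the compensation arithmetic is the same.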
Context and significance
Editorial analysis: The HKBU production exemplifies a growing, practice-oriented strand of AI deployment in the arts where institutions pair established creative programs with external art-tech specialists and robotics vendors. For practitioners in media, sound engineering, and interactive graphics, the concert is a practical demonstration of how academic theatres and cultural centres are experimenting with hybrid live-virtual experiences outside purely studio conditions.
What to watch
For practitioners: observers should track the event's handling of real-time synchronization (audio latency, avatar lip-sync, stage-AR registration), the sound-mixing approach used for a robotic vocalist versus human singers, and how the production integrates pre-rendered cinematic content with live performance. For researchers: any post-show technical writeups, production notes, or academic outputs from HKBU teams would be useful for understanding implementation choices and tradeoffs. For the public record: reporting attributes the artistic and technical claims to HKBU press materials and third-party coverage; the university's own press release pages provide additional event details.
Scoring Rationale
This is a notable demonstration of AI and immersive art-tech applied to live performance, with practical relevance to audio-visual and interactive media practitioners. It is not a frontier-model or infrastructure release, so its technical impact is moderate.