Accurate Generation of Trigger-Action Programs with Domain-Adapted Sequence-to-Sequence Learning
Trigger-action programming allows end users to write event-driven rules to automate smart devices and internet services. Users create a trigger-action program (TAP) by selecting triggers and actions from a set of predefined functions and specifying suitable data fields for those functions. Many trigger-action programming platforms have emerged as the paradigm has grown in popularity, e.g., IFTTT, Microsoft Power Automate, and Samsung SmartThings. Despite their simplicity, composing trigger-action programs (TAPs) can still be challenging for end users due to the domain knowledge required and the enormous search space formed by the many possible combinations of triggers and actions. We propose RecipeGen, a new deep-learning-based approach that leverages the Transformer sequence-to-sequence (seq2seq) architecture to generate TAPs at the fine-grained field level from natural-language descriptions. Our approach adapts pre-trained autoencoding models to warm-start the encoder in the seq2seq model to boost generation performance. We have evaluated RecipeGen on real-world datasets from the IFTTT platform against the prior state-of-the-art approach on the TAP generation task. Our empirical evaluation shows that the overall improvement over the prior best results ranges from 9.5% to 26.5%. Our results also show that adopting a pre-trained autoencoding model boosts MRR@3 by a further 2.8% to 10.8%. Furthermore, in the field-level generation setting, RecipeGen achieves 0.591 MRR@3 and 0.575 BLEU score, respectively.
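The MRR@3 metric reported above can be illustrated with a short sketch. This is a minimal, generic implementation of mean reciprocal rank with a top-3 cutoff; the function and variable names are our own and are not taken from the RecipeGen artifact:

```python
def mrr_at_k(ranked_candidates, ground_truth, k=3):
    """Mean Reciprocal Rank at cutoff k.

    ranked_candidates: one list of candidates per query, ordered by model score.
    ground_truth: the correct answer for each query.
    A query contributes 1/rank if the correct answer appears within the
    top-k candidates, and 0 otherwise.
    """
    total = 0.0
    for candidates, truth in zip(ranked_candidates, ground_truth):
        for rank, candidate in enumerate(candidates[:k], start=1):
            if candidate == truth:
                total += 1.0 / rank
                break  # only the first correct hit counts
    return total / len(ground_truth)

# Hypothetical example: three queries whose correct TAP is found at
# rank 1, rank 3, and not within the top 3, respectively.
preds = [["a", "b", "c"], ["x", "y", "t"], ["p", "q", "r"]]
gold = ["a", "t", "z"]
print(round(mrr_at_k(preds, gold), 3))  # (1 + 1/3 + 0) / 3 ≈ 0.444
```

Under this reading, RecipeGen's field-level MRR@3 of 0.591 means the correct field value tends to appear at or near the top of the model's ranked suggestions.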
Sun 15 May (displayed time zone: Eastern Time, US & Canada)
22:30 - 23:20 | Session 2: Program Representation 1 (Research track) at ICPC room. Chair(s): Fatemeh Hendijani Fard (University of British Columbia)
22:30 | Talk (7m) | Zero-Shot Program Representation Learning (Research). Nan Cui (Shanghai Jiao Tong University), Yuze Jiang (Shanghai Jiao Tong University), Xiaodong Gu (Shanghai Jiao Tong University, China), Beijun Shen (School of Electronic Information and Electrical Engineering, Shanghai Jiao Tong University). Pre-print and media attached.
22:37 | Talk (7m) | On The Cross-Modal Transfer from Natural Language to Code through Adapter Modules (Research). Divyam Goel (Indian Institute of Technology Roorkee), Ramansh Grover (Delhi Technological University), Fatemeh Hendijani Fard (University of British Columbia). Pre-print and media attached.
22:44 | Talk (7m) | Self-Supervised Learning of Smart Contract Representations (Research). Shouliang Yang (School of Software, Shanghai Jiao Tong University), Beijun Shen (School of Electronic Information and Electrical Engineering, Shanghai Jiao Tong University), Xiaodong Gu (Shanghai Jiao Tong University, China). Pre-print and media attached.
22:51 | Talk (7m) | An Exploratory Study on Code Attention in BERT (Research). Rishab Sharma (University of British Columbia), Fuxiang Chen (University of British Columbia), Fatemeh Hendijani Fard (University of British Columbia), David Lo (Singapore Management University). Pre-print and media attached.
22:58 | Talk (7m) | Accurate Generation of Trigger-Action Programs with Domain-Adapted Sequence-to-Sequence Learning (Research). Imam Nur Bani Yusuf (Singapore Management University), Lingxiao Jiang (Singapore Management University), David Lo (Singapore Management University). DOI, pre-print, and media attached.
23:05 | Live Q&A (15m) | Q&A - Paper Session 2 (Research).