Dates
Plenary
Fri 20 Feb
Displayed time zone: Chennai, Kolkata, Mumbai, New Delhi

09:30 - 10:30  Keynote (60m): Software Dependencies: Then, Now, and What’s Next. Sarah Nadi, New York University Abu Dhabi. (ISEC 2026 Keynotes)
Sat 21 Feb
Displayed time zone: Chennai, Kolkata, Mumbai, New Delhi

09:00 - 09:45  Talk (45m): Towards Neural Synthesis for SMT-Assisted Proof-Oriented Programming. Saikat Chakraborty, Microsoft Research. (ISEC 2026 Keynotes)
09:45 - 10:30  Talk (45m): A Transferability Study of Interpolation-Based Hardware Model Checking for Software Verification. Nian-Ze Lee, National Taiwan University, Taiwan. (ISEC 2026 Keynotes)
11:00 - 11:30  Talk (30m): Impact of Feature Selection Techniques on Bug Prediction Models. (ISEC 2026 Keynotes)
14:00 - 15:00  Keynote (60m): Shipping Models, Not Just Code: How AI Is Forcing Software Engineering to Evolve. Siddhartha Asthana, Mastercard. (ISEC 2026 Keynotes)
Call for Student Posters
ISEC is a premier international conference that brings together researchers, practitioners, and educators to explore the latest advancements, trends, and challenges in the field of software engineering.
As part of ISEC 2026, we are excited to host a Student Posters Session, a vibrant forum where students can present their ongoing research work to the academic and professional community.
Why Participate?
- Receive constructive feedback from domain experts
- Improve research communication skills
- Network with leading researchers and industry professionals
Abstracts for all accepted posters will be included in the Posters Report, which forms part of the conference proceedings to be published in the ACM Digital Library.
Best Poster Awards
A panel of domain experts will evaluate the posters based on the significance of the research problem and the clarity of presentation. Prizes will be awarded to the best poster submissions.
Eligibility
The poster session is open to students who:
- Are currently enrolled in an undergraduate or post-graduate program, OR
- Have recently completed their undergraduate or master's degree and are employed as a research associate or research fellow
If you're uncertain about eligibility, feel free to contact the poster session chairs:
- Harshita Bhargava (harshita.bhargava@iisuniv.ac.in)
- Nilotpal Chakraborty (nilotpal@iiitg.ac.in)
- Sunil Kumar (sunil@lmniit.ac.in)
Suggested Themes
- Open Source Communities in Software Engineering Practices
- Software Engineering in Edge and IoT Environments
- AI-assisted Software Development
- Privacy-Preserving and Secure Software Engineering
- Impact of Generative AI in Software Development
- Sustainability and Green Software Engineering
- Formal Methods in Software Testing and Verification
- Innovations in Software Engineering Pedagogy
Submission Guidelines
- Submit a 2-page abstract using the Submission Form.
- Clearly describe the problem, significance, approach, and results (if available).
- Early-stage research ideas are welcome.
Important Dates
- Poster Abstract Submission Deadline: 05 January 2026
- Notification of Acceptance: 19 January 2026
- Student Posters Session: During ISEC 2026
All submissions will undergo peer review. Selected abstracts will be invited to participate in the poster session at ISEC 2026. Detailed poster preparation guidelines will be shared upon acceptance.
Data Science Challenge
Description
In 2026, ISEC will host its Student Data Challenge Competition (SDC). SDC provides a unique platform for undergraduate and graduate students (Master's or early PhD), as well as aspiring data scientists and analysts, to showcase their abilities and compete against their peers. The competition is hosted in conjunction with the ISEC conference, offering participants the opportunity to network with industry experts, gain valuable insights, and receive recognition for their achievements.
The Challenge
Predict Faulty Code Using Static Code Metrics
Context
Software quality assurance teams spend enormous effort detecting and fixing bugs in large codebases. Many defects hide inside seemingly simple files — but static code metrics often reveal deeper complexity patterns.
Objective
The data challenge focuses on predicting whether a software code instance is faulty or non-faulty using machine learning (ML) or deep learning (DL) techniques. Participants must develop predictive algorithms capable of identifying patterns within static code metrics to detect defects early.
Task
Predict whether a code instance is faulty or non-faulty using only structural metrics extracted from the source code.
Dataset Provided
- train.xlsx – Code metrics with Fault label (4,236 instances)
- test.xlsx – Code metrics only (1,816 instances)
- Target label: Fault {0, 1}
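Before building an ML or DL model, it can help to establish a simple baseline. The sketch below flags an instance as faulty when a single structural metric exceeds a cutoff; the metric name `CyclomaticComplexity` and the cutoff are hypothetical placeholders, since the actual column names in train.xlsx are not specified here.

```python
def threshold_baseline(rows, metric="CyclomaticComplexity", cutoff=10):
    """Naive single-metric baseline: predict faulty (1) when the chosen
    metric exceeds the cutoff, otherwise clean (0).

    `rows` is a list of dicts mapping metric names to numeric values.
    Both the metric name and the cutoff are illustrative assumptions,
    not the real columns of the challenge dataset.
    """
    return [1 if row.get(metric, 0) > cutoff else 0 for row in rows]
```

Any trained model should beat this kind of rule on the Macro F1 metric; it mainly serves as a sanity check for the evaluation pipeline.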
Competition Rounds:
- Round 1 (Online, Kaggle): Automatic evaluation using Macro F1-score.
- Round 2 (Offline Finale): Presentation of solutions with emphasis on robustness and explainability (XAI).
Submission File Format
CSV file with 1,816 entries and two columns:
1. Issue-ID
2. Fault (0 = clean, 1 = faulty)
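A submission file in this two-column format can be written with Python's standard `csv` module; a minimal sketch, assuming predictions are available as (issue_id, fault) pairs:

```python
import csv

def write_submission(predictions, path="submission.csv"):
    """Write a challenge submission CSV.

    `predictions` is an iterable of (issue_id, fault) pairs, with
    fault in {0, 1}. The header names match the required columns.
    """
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["Issue-ID", "Fault"])
        for issue_id, fault in predictions:
            writer.writerow([issue_id, int(fault)])
```

For a valid submission the file should contain exactly 1,816 prediction rows, one per instance in test.xlsx.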
Evaluation Metric
Macro F1-score
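Macro F1 computes the F1 score for each class independently and then averages them, so the minority (faulty) class counts as much as the majority class. A plain-Python sketch of the metric (equivalent to scikit-learn's `f1_score` with `average="macro"`):

```python
def macro_f1(y_true, y_pred, labels=(0, 1)):
    """Macro-averaged F1: mean of the per-class F1 scores.

    For each class c, precision = TP / (TP + FP) and
    recall = TP / (TP + FN); F1 is their harmonic mean.
    Empty denominators are treated as 0, matching common convention.
    """
    f1_scores = []
    for c in labels:
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        denom = precision + recall
        f1_scores.append(2 * precision * recall / denom if denom else 0.0)
    return sum(f1_scores) / len(f1_scores)
```

Because both classes are weighted equally, always predicting the majority class scores poorly, which is why the metric suits an imbalanced fault-prediction task.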
For queries, contact Dr. Amita Sharma and Dr. Anukriti Bansal.
Deadline: 16 January 2026