Mentored by

School of Computer Engineering,

KIIT Deemed to be University, Bhubaneswar, India

Tech4Change: CINE 2026 HACKATHON in collaboration with JUSENSE

Overview

The Tech4Change (CINE 2026 Hackathon), in collaboration with JUSENSE, invites participants to develop AI-powered solutions addressing challenges in Urban Infrastructure. Rapid urbanization demands smarter, more sustainable, and more adaptive infrastructure systems that can efficiently manage resources, mobility, safety, and the environment.

Rules of Engagement

📌 Team participation is essential. Each team must have a minimum of two (2) and a maximum of four (4) members.

📌 Eligibility: UG (B.Tech, BCA, etc.), PG (M.Tech, MCA, etc.), and Young Professionals

📌 Teams must register before participating in the Tech4Change Hackathon (pay the registration fee and fill out the form).

📌 The registration fee is INR 500 per member (e.g., INR 1000 for a two-member team).

Prize Money

Winner (First Place): INR 12000

Runner-up (Second Place): INR 8000

Background

Intelligent Footpath Condition Detection, Monitoring, and Mapping using AI-Driven CPS

Pedestrian pathways such as footpaths or pavements play a vital role in promoting safe, accessible, and sustainable urban mobility. However, in many cities, these footpaths are cluttered with obstacles, damaged, or encroached upon, making them unsafe or unusable for pedestrians. Conventional inspection and maintenance methods are manual, time-consuming, and inefficient, often failing to capture dynamic urban conditions.


Hence, there is a strong need for an AI-driven, scalable, and automated monitoring framework capable of identifying, classifying, and mapping footpath conditions in real time.

Challenge

Design and develop a Deep Learning-powered Cyber-Physical System (CPS) for automated detection, classification, and geotagging of footpath/pavement conditions using visual and location data.


The system should:

  • Analyze camera images (smartphone, vehicle-mounted, or drone-based).

  • Classify each segment into categories such as "Fully Occupied," "Semi-Occupied," or "Free for Use" (see the classification sketch after this list).

  • Generate real-time or near-real-time map-based visualizations for urban authorities to support decision-making on pedestrian safety and maintenance.
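
As a starting point for the classification requirement above, the sketch below shows a minimal single-image inference path, assuming a PyTorch/torchvision setup with a MobileNetV3 backbone and a three-class head. The checkpoint file, class names, and image path are illustrative placeholders, not materials supplied by the organizers.

```python
# Minimal footpath-condition classification sketch (PyTorch / torchvision).
# "footpath_classifier.pt" is a hypothetical checkpoint from your own training run.
import torch
from PIL import Image
from torchvision import models, transforms

CLASSES = ["Fully Occupied", "Semi-Occupied", "Free for Use"]

# MobileNetV3 backbone with a 3-class head; weights come from your own fine-tuning.
model = models.mobilenet_v3_small(weights=None, num_classes=len(CLASSES))
model.load_state_dict(torch.load("footpath_classifier.pt", map_location="cpu"))
model.eval()

# Standard ImageNet-style preprocessing for a single RGB image.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def classify(image_path: str) -> str:
    """Return the predicted footpath condition for one image."""
    image = Image.open(image_path).convert("RGB")
    batch = preprocess(image).unsqueeze(0)   # shape: (1, 3, 224, 224)
    with torch.no_grad():
        logits = model(batch)
    return CLASSES[int(logits.argmax(dim=1))]

print(classify("sample_footpath.jpg"))
```

Detection-style models are an equally valid route; a YOLOv8 sketch follows the Scope for Innovation list below.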

Scope for Innovation (Flexible Dimensions)

  • Data Source: Smartphone or vehicle-mounted camera images/videos, drone footage, CCTV feeds, or open footpath datasets.

  • Example Dataset: Participants may use existing open datasets, generate their own datasets, or utilize the following sample dataset.

                    🔗 https://www.kaggle.com/datasets/afifaniks/footpath-image-datase

  • Sensing Modality: RGB images, video streams, or GPS-tagged media.

  • Modeling Techniques: Deep learning (e.g., YOLOv8, MobileNet, EfficientNet, Vision Transformers) or hybrid models for object detection and scene understanding (a brief YOLOv8 usage sketch follows this list).

  • Deployment: Cloud-based, edge AI, or federated setups for CPS integration.

  • Visualization: Web-based dashboard or interactive GIS-enabled visualization for city planners.
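
Since YOLOv8 is one of the modeling options named above, the following is a minimal sketch using the ultralytics package; the dataset config footpath.yaml, the trained weights, and the test image name are illustrative assumptions that each team would replace with its own assets.

```python
# Minimal YOLOv8 fine-tuning and inference sketch (ultralytics package).
# "footpath.yaml" and "street_scene.jpg" are illustrative placeholders.
from ultralytics import YOLO

# Start from a pretrained nano model and fine-tune on a custom footpath dataset.
model = YOLO("yolov8n.pt")
model.train(data="footpath.yaml", epochs=50, imgsz=640)

# Run detection on a new image and print the predicted boxes, classes, and scores.
results = model("street_scene.jpg")
for result in results:
    for box in result.boxes:
        name = model.names[int(box.cls)]
        conf = float(box.conf)
        x1, y1, x2, y2 = box.xyxy[0].tolist()
        print(f"{name} ({conf:.2f}) at [{x1:.0f}, {y1:.0f}, {x2:.0f}, {y2:.0f}]")
```

The same call also accepts video files or camera streams as the source, which maps naturally onto the vehicle-mounted and drone footage options listed above.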

Sub Tasks

  • SubTask 1 (Annotated Dataset): Prepare or use an existing annotated dataset containing various footpath conditions (occupied, semi-occupied, free).

  • SubTask 2 (AI Model Development): Build and train a deep learning model to classify and localize footpath segments based on obstruction levels or damage.

  • SubTask 3 (CPS Integration): Implement a pipeline where an edge device (e.g., smartphone) captures images → sends data to a local/server model → generates predictions → updates the map (a minimal server-side sketch follows this list).

  • SubTask 4 (Web Dashboard): Develop a dashboard that visualizes categorized footpath segments on a geo-map, enabling urban monitoring and planning.
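
For SubTasks 3 and 4, the sketch below illustrates one possible server side of the capture → inference → map-update loop, assuming Flask for the ingestion endpoint and folium for the geo-map. The footpath_model module, the /report route, and the file names are hypothetical; the classify() helper stands in for whatever trained model a team deploys.

```python
# Minimal CPS pipeline sketch: an edge device POSTs a geotagged photo, the server
# classifies it, logs the result, and regenerates a folium map for the dashboard.
# footpath_model.classify is a hypothetical wrapper around your trained model.
import csv
import os

import folium
from flask import Flask, jsonify, request

from footpath_model import classify  # hypothetical inference helper

app = Flask(__name__)
LOG_FILE = "footpath_log.csv"
MAP_FILE = "footpath_map.html"
COLORS = {"Fully Occupied": "red", "Semi-Occupied": "orange", "Free for Use": "green"}

def rebuild_map() -> None:
    """Redraw the geo-map from every logged prediction."""
    with open(LOG_FILE, newline="") as f:
        rows = list(csv.DictReader(f))
    if not rows:
        return
    fmap = folium.Map(location=[float(rows[0]["lat"]), float(rows[0]["lon"])], zoom_start=16)
    for row in rows:
        folium.CircleMarker(
            location=[float(row["lat"]), float(row["lon"])],
            radius=6,
            color=COLORS.get(row["label"], "blue"),
            fill=True,
            popup=row["label"],
        ).add_to(fmap)
    fmap.save(MAP_FILE)

@app.route("/report", methods=["POST"])
def report():
    """Accept one geotagged image from an edge device and return the prediction."""
    image = request.files["image"]
    lat, lon = request.form["lat"], request.form["lon"]
    os.makedirs("uploads", exist_ok=True)
    path = os.path.join("uploads", image.filename)
    image.save(path)

    label = classify(path)                       # model inference on the server
    is_new_log = not os.path.exists(LOG_FILE)
    with open(LOG_FILE, "a", newline="") as f:
        writer = csv.writer(f)
        if is_new_log:
            writer.writerow(["lat", "lon", "label"])
        writer.writerow([lat, lon, label])
    rebuild_map()                                # keep the dashboard map current
    return jsonify({"label": label, "lat": lat, "lon": lon})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```

An edge client (e.g., a smartphone script) would then send each photo with a single requests.post call to /report, attaching the image file and its GPS coordinates, and the dashboard of SubTask 4 can simply serve or embed the regenerated footpath_map.html.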

Expected Deliverables

  • Trained AI model for multi-class footpath condition detection.

  • Annotated dataset (self-created or open-source).

  • A working CPS prototype (data capture → model inference → map visualization).

  • Web dashboard showing categorized footpath or pavement segments.

  • Evaluation metrics (Accuracy, Precision, Recall, F1-score, mAP@[IoU=0.5]); a short computation sketch follows this list.

  • Presentation or report (max 15 slides / 4 pages) covering dataset, model, pipeline, and deployment details.
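
For the classification metrics listed above, the short scikit-learn sketch below shows one way to report them; the label lists are placeholder values standing in for a real test split, and mAP@[IoU=0.5] for detection models would normally be taken from the detection framework's own validation output rather than computed by hand.

```python
# Evaluation sketch for the classification metrics (scikit-learn).
# y_true / y_pred are placeholder labels standing in for a real test split.
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

CLASSES = ["Fully Occupied", "Semi-Occupied", "Free for Use"]

y_true = ["Free for Use", "Semi-Occupied", "Fully Occupied", "Free for Use", "Semi-Occupied"]
y_pred = ["Free for Use", "Fully Occupied", "Fully Occupied", "Free for Use", "Semi-Occupied"]

accuracy = accuracy_score(y_true, y_pred)
# Macro-averaged precision, recall, and F1 treat all three classes equally.
precision, recall, f1, _ = precision_recall_fscore_support(
    y_true, y_pred, labels=CLASSES, average="macro", zero_division=0
)
print(f"Accuracy: {accuracy:.2f}  Precision: {precision:.2f}  "
      f"Recall: {recall:.2f}  F1-score: {f1:.2f}")
```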

Hackathon Structure

  • Round 1: Solution Design and Model Development

    • Share code (GitHub) containing the dataset, model, and implementation details.

    • Provide a 4-page technical summary including:

      • Problem understanding and methodology

      • Dataset and preprocessing

      • Model architecture and performance metrics

      • Preliminary dashboard visualization (optional)
         

  • Round 2: Evaluation and Shortlisting

    • Submissions will be evaluated by domain experts on innovation, technical depth, and feasibility of CPS integration.

    • Top-performing teams will be shortlisted for the final round based on:

      • Accuracy and robustness of the model

      • Quality of documentation and visualization

      • Novelty and scalability of approach
         

  • Round 3: Final Round (Offline Presentation & Demo)

    • Finalists will present their complete solution, covering:

      • Model architecture, deployment pipeline, and CPS prototype

      • Web dashboard demonstration with live or recorded video

      • Key insights, limitations, and potential real-world use cases

    • Teams must submit a 15-slide presentation showcasing their final work.

Organized by

© 2022-26 CINE Conference Community
