import os
os.environ['TRANSFORMERS_CACHE'] = '/app/.cache'  # Point the model cache at a writable directory (needed on restricted hosts)

from flask import Flask, request, jsonify
from flask_cors import CORS
from flask_sock import Sock
import time
import requests
from transformers import pipeline

# Initialize Flask app and WebSocket
app = Flask(__name__)
CORS(app)
sock = Sock(app)

# Zero-shot classification pipeline (facebook/bart-large-mnli): no task-specific
# training needed, but note the model download itself is large (~1.6 GB)
classifier = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")

# Active chat sessions: {session_id: [ {role, msg, ip, timestamp} ]}
SESSIONS = {}

# Labels we want AI to detect (can be expanded)
SENSITIVE_LABELS = ["terrorism", "blackmail", "national security threat"]

# Storage space API to log flagged chats
STORAGE_API = "https://mike23415-storage.hf.space/api/flag"

def flag_if_sensitive(text, ip, session_id, role):
    # Score the message against each sensitive label with the zero-shot classifier
    result = classifier(text, candidate_labels=SENSITIVE_LABELS)
    scores = dict(zip(result["labels"], result["scores"]))

    # Threshold to consider a message flagged
    for label, score in scores.items():
        if score > 0.8:
            print(f"⚠️ FLAGGED: {label} with score {score}")
            payload = {
                "ip": ip,
                "session_id": session_id,
                "timestamp": time.time(),
                "role": role,
                "message": text,
                "label": label,
                "score": score
            }
            # Send to storage space for later review
            try:
                requests.post(STORAGE_API, json=payload, timeout=3)
            except Exception as e:
                print("Failed to send flag:", e)
            break
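For reference, a minimal sketch of the thresholding above using a mocked pipeline result (the real zero-shot pipeline returns parallel, score-sorted `labels`/`scores` lists; the values here are made up):

```python
# Mocked zero-shot output in the same shape the pipeline returns
result = {
    "labels": ["blackmail", "terrorism", "national security threat"],
    "scores": [0.91, 0.06, 0.03],
}

# Same label -> score mapping and 0.8 cutoff as flag_if_sensitive()
scores = dict(zip(result["labels"], result["scores"]))
flagged = [label for label, score in scores.items() if score > 0.8]
print(flagged)  # only "blackmail" clears the 0.8 threshold
```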

# Active WebSocket connections per session: {session_id: [ws, ...]}
CONNECTIONS = {}

@sock.route('/ws/<session_id>')
def chat(ws, session_id):
    ip = request.remote_addr or "unknown"
    SESSIONS.setdefault(session_id, [])
    conns = CONNECTIONS.setdefault(session_id, [])

    # First connection in a session is the Sender; later joiners become numbered Receivers
    role = "Sender" if not conns else f"Receiver {len(conns)}"
    conns.append(ws)

    try:
        while True:
            msg = ws.receive()
            if msg is None:
                break
            entry = {
                "role": role,
                "msg": msg,
                "ip": ip,
                "timestamp": time.time()
            }
            SESSIONS[session_id].append(entry)

            # AI flagging (runs inline, so classification briefly blocks this loop)
            flag_if_sensitive(msg, ip, session_id, role)

            # Broadcast to every open connection in this session
            for conn in list(conns):
                try:
                    conn.send(f"{role}: {msg}")
                except Exception:
                    continue
    except Exception:
        pass
    finally:
        # Drop this connection so the session roster stays accurate
        if ws in conns:
            conns.remove(ws)

@app.route("/")
def root():
    return "Real-time AI chat backend is running."

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=7860)
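For manual testing, a hypothetical client could mint its own session id and build the WebSocket URL this server exposes (host, port, and the use of `uuid` here are illustrative, not part of the server):

```python
import uuid

# Fresh session id; any client with the same id joins the same chat room
session_id = uuid.uuid4().hex
ws_url = f"ws://localhost:7860/ws/{session_id}"
print(ws_url)
```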