
Polis Deliberation Mirror

A Habermolt-inspired deliberation page rebuilt from Polis export files.


What kind of AI use would people accept in care, government, and everyday public life?

This view is rebuilt solely from Polis export files: it uses vote vectors, cross-group support, and local inference, with no hidden APIs or agent metadata.

62 participants · 671 votes · 20 statements · 3 clusters · 24 Mar 2026 → 26 Mar 2026

Living consensus #1

Schools should teach children to question what AI tells them, not just how to use it.

This lead statement is derived from overall support and the weakest support across the discovered Polis clusters.

Overall support 94%
Weakest-cluster support 88%
Strongest in Cluster C · Weakest in Cluster B
Cluster A 92%
Cluster B 88%
Cluster C 100%
Unclustered 100%
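The selection rule above can be sketched in a few lines. This is a hypothetical reconstruction, not the actual pipeline: the per-cluster figures are copied from the cards above, but the field names and the exact ranking rule (overall support first, weakest-cluster support as tiebreak) are assumptions.

```python
# Toy subset of the export: per-statement overall support and per-cluster
# agreement rates, as shown on the cards above.
statements = {
    1: {"overall": 0.94, "clusters": {"A": 0.92, "B": 0.88, "C": 1.00}},
    2: {"overall": 0.93, "clusters": {"A": 0.92, "B": 0.89, "C": 1.00}},
    4: {"overall": 0.91, "clusters": {"A": 1.00, "B": 0.89, "C": 0.80}},
}

def bridge_floor(stmt):
    # Weakest support across the discovered Polis clusters.
    return min(stmt["clusters"].values())

# Assumed rule: rank by overall support, break ties on the bridge floor.
lead = max(statements, key=lambda s: (statements[s]["overall"],
                                      bridge_floor(statements[s])))
print(lead)  # -> 1 (94% overall, 88% weakest-cluster support)
```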

Statement map

Ranked public statements

Each dot is a Polis statement. Proximity comes from similar voting patterns, while size and colour reflect the derived bridge score.
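A minimal sketch of that proximity idea, assuming each statement is stored as a +1/0/-1 (agree/pass/disagree) vector over the same ordered participants; cosine similarity between two such vectors is one common way to say "these statements were voted on similarly". The toy vectors below are invented for illustration, and the real layout may use a different projection.

```python
import math

def cosine(u, v):
    # Cosine similarity between two vote vectors; 0.0 if either is all-pass.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

# Toy vote vectors for three statements over five participants.
s1 = [1, 1, 1, 0, -1]
s2 = [1, 1, 0, 0, -1]   # voted on much like s1 -> plotted nearby
s3 = [-1, -1, 1, 1, 1]  # voted on unlike s1 -> plotted far away
print(cosine(s1, s2) > cosine(s1, s3))  # True
```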

1. Schools should teach children to question what AI tells them, not just how to use it. (94% support, 88% bridge floor, 31 votes, weakest in Cluster B)
2. When AI helps make a decision about your benefits, housing, or health, you should be told. (93% support, 89% bridge floor, 30 votes, weakest in Cluster B)
3. If an AI system treats people unfairly because of their race, age, or disability, someone should be held responsible. (83% support, 86% bridge floor, 36 votes, weakest in Cluster A)
4. People should always be able to reach a real person when dealing with a government service, even if AI handles most tasks. (91% support, 80% bridge floor, 33 votes, weakest in Cluster C)
5. AI companies should have to pay artists and writers when they use their work to train AI systems. (88% support, 78% bridge floor, 34 votes, weakest in Cluster B)
6. AI in care homes should free up time for staff to spend with residents, not replace human contact. (91% support, 78% bridge floor, 32 votes, weakest in Cluster B)
7. People are being harmed by AI-driven decisions while the government takes too long to act. (86% support, 78% bridge floor, 35 votes, weakest in Cluster B)
8. Communities should have a voice in deciding how AI is used in their local schools and hospitals. (86% support, 77% bridge floor, 35 votes, weakest in Cluster A)
9. When I talk to a chatbot or AI assistant, I should always be told it is not a real person. (88% support, 75% bridge floor, 33 votes, weakest in Cluster C)
10. Big technology companies care more about profits than about what happens to our communities. (87% support, 75% bridge floor, 31 votes, weakest in Cluster B)
11. Workers who lose their jobs because of AI should get real help finding new work, not just advice. (78% support, 77% bridge floor, 36 votes, weakest in Cluster A)
12. People living with dementia deserve a say in whether AI tools are used in their care. (87% support, 63% bridge floor, 30 votes, weakest in Cluster B)
13. A community that takes care of its people matters more than one with the most advanced technology. (77% support, 71% bridge floor, 30 votes, weakest in Cluster B)
14. The UK should create an independent body with real power to shut down harmful AI systems. (74% support, 58% bridge floor, 35 votes, weakest in Cluster A)
15. Using AI to keep an elderly person company when no human is available is better than leaving them alone. (64% support, 55% bridge floor, 33 votes, weakest in Cluster B)
16. We should slow down on AI until we better understand what it does to people. (62% support, 50% bridge floor, 37 votes, weakest in Cluster A)
17. Families caring for someone with a serious illness should be offered AI tools to help, even if those tools are not perfect. (51% support, 38% bridge floor, 41 votes, weakest in Cluster A)
18. New AI data centres in Oxfordshire will create good jobs for local people, not just for tech workers from elsewhere. (41% support, 25% bridge floor, 29 votes, weakest in Cluster B)
19. Companies should not be allowed to replace workers with AI unless they help those workers find new roles. (42% support, 17% bridge floor, 36 votes, weakest in Cluster A)
20. AI tools in schools do more to help struggling students catch up than they do to harm learning. (24% support, 0% bridge floor, 34 votes, weakest in Cluster C)
Selected statement 1

Schools should teach children to question what AI tells them, not just how to use it.

Support 94%
Disagreement 3%
Votes 31

Posted: 24 Mar 2026

Cluster A 92%
Cluster B 88%
Cluster C 100%
Unclustered 100%
Selected statement 2

When AI helps make a decision about your benefits, housing, or health, you should be told.

Support 93%
Disagreement 0%
Votes 30

Posted: 24 Mar 2026

Cluster A 92%
Cluster B 89%
Cluster C 100%
Unclustered 100%
Selected statement 3

If an AI system treats people unfairly because of their race, age, or disability, someone should be held responsible.

Support 83%
Disagreement 6%
Votes 36

Posted: 24 Mar 2026

Cluster A 86%
Cluster B 86%
Cluster C 86%
Unclustered 75%
Selected statement 4

People should always be able to reach a real person when dealing with a government service, even if AI handles most tasks.

Support 91%
Disagreement 3%
Votes 33

Posted: 24 Mar 2026

Cluster A 100%
Cluster B 89%
Cluster C 80%
Unclustered 88%
Selected statement 5

AI companies should have to pay artists and writers when they use their work to train AI systems.

Support 88%
Disagreement 6%
Votes 34

Posted: 24 Mar 2026

Cluster A 93%
Cluster B 78%
Cluster C 100%
Unclustered 86%
Selected statement 6

AI in care homes should free up time for staff to spend with residents, not replace human contact.

Support 91%
Disagreement 0%
Votes 32

Posted: 24 Mar 2026

Cluster A 100%
Cluster B 78%
Cluster C 100%
Unclustered 86%
Selected statement 7

People are being harmed by AI-driven decisions while the government takes too long to act.

Support 86%
Disagreement 9%
Votes 35

Posted: 24 Mar 2026

Cluster A 83%
Cluster B 78%
Cluster C 80%
Unclustered 100%
Selected statement 8

Communities should have a voice in deciding how AI is used in their local schools and hospitals.

Support 86%
Disagreement 6%
Votes 35

Posted: 24 Mar 2026

Cluster A 77%
Cluster B 90%
Cluster C 100%
Unclustered 88%
Selected statement 9

When I talk to a chatbot or AI assistant, I should always be told it is not a real person.

Support 88%
Disagreement 6%
Votes 33

Posted: 24 Mar 2026

Cluster A 91%
Cluster B 80%
Cluster C 75%
Unclustered 100%
Selected statement 10

Big technology companies care more about profits than about what happens to our communities.

Support 87%
Disagreement 3%
Votes 31

Posted: 24 Mar 2026

Cluster A 92%
Cluster B 75%
Cluster C 100%
Unclustered 88%
Selected statement 11

Workers who lose their jobs because of AI should get real help finding new work, not just advice.

Support 78%
Disagreement 11%
Votes 36

Posted: 24 Mar 2026

Cluster A 77%
Cluster B 78%
Cluster C 100%
Unclustered 67%
Selected statement 12

People living with dementia deserve a say in whether AI tools are used in their care.

Support 87%
Disagreement 3%
Votes 30

Posted: 24 Mar 2026

Cluster A 100%
Cluster B 63%
Cluster C 100%
Unclustered 83%
Selected statement 13

A community that takes care of its people matters more than one with the most advanced technology.

Support 77%
Disagreement 10%
Votes 30

Posted: 24 Mar 2026

Cluster A 82%
Cluster B 71%
Cluster C 100%
Unclustered 63%
Selected statement 14

The UK should create an independent body with real power to shut down harmful AI systems.

Support 74%
Disagreement 9%
Votes 35

Posted: 24 Mar 2026

Cluster A 58%
Cluster B 80%
Cluster C 75%
Unclustered 89%
Selected statement 15

Using AI to keep an elderly person company when no human is available is better than leaving them alone.

Support 64%
Disagreement 15%
Votes 33

Posted: 24 Mar 2026

Cluster A 64%
Cluster B 55%
Cluster C 100%
Unclustered 67%
Selected statement 16

We should slow down on AI until we better understand what it does to people.

Support 62%
Disagreement 24%
Votes 37

Posted: 24 Mar 2026

Cluster A 50%
Cluster B 73%
Cluster C 100%
Unclustered 50%
Selected statement 17

Families caring for someone with a serious illness should be offered AI tools to help, even if those tools are not perfect.

Support 51%
Disagreement 32%
Votes 41

Posted: 24 Mar 2026

Cluster A 38%
Cluster B 45%
Cluster C 100%
Unclustered 40%
Selected statement 18

New AI data centres in Oxfordshire will create good jobs for local people, not just for tech workers from elsewhere.

Support 41%
Disagreement 24%
Votes 29

Posted: 24 Mar 2026

Cluster A 50%
Cluster B 25%
Cluster C 50%
Unclustered 43%
Selected statement 19

Companies should not be allowed to replace workers with AI unless they help those workers find new roles.

Support 42%
Disagreement 42%
Votes 36

Posted: 24 Mar 2026

Cluster A 17%
Cluster B 44%
Cluster C 100%
Unclustered 45%
Selected statement 20

AI tools in schools do more to help struggling students catch up than they do to harm learning.

Support 24%
Disagreement 47%
Votes 34

Posted: 24 Mar 2026

Cluster A 38%
Cluster B 22%
Cluster C 0%
Unclustered 13%

Participant map

Anonymous voting blocs and participants

Participants are coloured by Polis cluster when available. The map is a lightweight vote-pattern projection, not a semantic embedding.
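Such voting blocs can be recovered from the vote matrix alone. A stripped-down sketch of the idea, assuming plain k-means on raw +1/0/-1 vote rows; Polis itself first projects participants with PCA and chooses the number of groups automatically, and the toy rows below are invented, not taken from this conversation.

```python
import random

def kmeans(rows, k, iters=20, seed=0):
    # Lloyd's algorithm on participant vote rows (+1 agree, 0 pass, -1 disagree).
    rng = random.Random(seed)
    centres = [list(r) for r in rng.sample(rows, k)]
    for _ in range(iters):
        # Assign each participant to the nearest centre (squared distance).
        labels = [
            min(range(k),
                key=lambda c: sum((x - y) ** 2
                                  for x, y in zip(row, centres[c])))
            for row in rows
        ]
        # Recompute each centre as the mean vote row of its members.
        for c in range(k):
            members = [r for r, lab in zip(rows, labels) if lab == c]
            if members:
                centres[c] = [sum(col) / len(members) for col in zip(*members)]
    return labels

# Two obvious blocs in toy data: agree-heavy vs disagree-heavy voters.
rows = [[1, 1, 1], [1, 1, 0], [-1, -1, -1], [-1, 0, -1]]
labels = kmeans(rows, 2)
print(labels[0] == labels[1], labels[2] == labels[3])  # True True
```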

Cluster balance

Cluster A 34% · Cluster B 27% · Cluster C 11% · Unclustered 27%
Participant 02
Cluster A
7 votes cast 5 agree 2 disagree
Agreed with

#3, #5, #13

Pushed back on

#8, #19

Participant 05
Cluster A
20 votes cast 15 agree 5 disagree
Agreed with

#1, #2, #3

Pushed back on

#14, #16, #17

Participant 08
Cluster A
3 votes cast 0 agree 0 disagree
Participant 09
Cluster A
1 vote cast 1 agree 0 disagree
Agreed with

#11

Participant 12
Cluster A
4 votes cast 3 agree 1 disagree
Agreed with

#1, #16, #18

Pushed back on

#20

Participant 13
Cluster A
20 votes cast 13 agree 7 disagree
Agreed with

#1, #2, #4

Pushed back on

#3, #5, #7

Participant 14
Cluster A
20 votes cast 13 agree 4 disagree
Agreed with

#2, #3, #4

Pushed back on

#1, #11, #16

Participant 16
Cluster A
20 votes cast 17 agree 3 disagree
Agreed with

#1, #2, #3

Pushed back on

#7, #11, #19

Participant 17
Cluster A
2 votes cast 2 agree 0 disagree
Agreed with

#5, #12

Participant 19
Cluster A
3 votes cast 3 agree 0 disagree
Agreed with

#2, #7, #10

Participant 21
Cluster A
10 votes cast 8 agree 2 disagree
Agreed with

#1, #3, #4

Pushed back on

#14, #19

Participant 24
Cluster A
10 votes cast 8 agree 2 disagree
Agreed with

#1, #2, #3

Pushed back on

#15, #17

Participant 25
Cluster A
1 vote cast 0 agree 1 disagree
Pushed back on

#16

Participant 27
Cluster A
8 votes cast 8 agree 0 disagree
Agreed with

#3, #4, #6

Participant 28
Cluster A
2 votes cast 1 agree 1 disagree
Agreed with

#5

Pushed back on

#20

Participant 31
Cluster A
15 votes cast 12 agree 2 disagree
Agreed with

#1, #2, #3

Pushed back on

#16, #20

Participant 32
Cluster A
20 votes cast 15 agree 4 disagree
Agreed with

#1, #2, #3

Pushed back on

#11, #16, #17

Participant 35
Cluster A
20 votes cast 10 agree 1 disagree
Agreed with

#1, #4, #5

Pushed back on

#15

Participant 37
Cluster A
20 votes cast 14 agree 6 disagree
Agreed with

#1, #2, #3

Pushed back on

#10, #16, #17

Participant 42
Cluster A
20 votes cast 17 agree 3 disagree
Agreed with

#1, #2, #3

Pushed back on

#17, #18, #20

Participant 46
Cluster A
20 votes cast 17 agree 3 disagree
Agreed with

#1, #2, #3

Pushed back on

#9, #15, #18

Participant 01
Cluster B

Comment author

20 votes cast 0 agree 0 disagree
Participant 06
Cluster B
1 vote cast 1 agree 0 disagree
Agreed with

#8

Participant 07
Cluster B
1 vote cast 1 agree 0 disagree
Agreed with

#17

Participant 10
Cluster B
4 votes cast 4 agree 0 disagree
Agreed with

#4, #8, #13

Participant 11
Cluster B
6 votes cast 4 agree 0 disagree
Agreed with

#2, #4, #11

Participant 15
Cluster B
20 votes cast 16 agree 2 disagree
Agreed with

#1, #2, #3

Pushed back on

#19, #20

Participant 18
Cluster B
20 votes cast 11 agree 2 disagree
Agreed with

#1, #2, #3

Pushed back on

#7, #18

Participant 22
Cluster B
1 vote cast 1 agree 0 disagree
Agreed with

#15

Participant 23
Cluster B
1 vote cast 1 agree 0 disagree
Agreed with

#16

Participant 26
Cluster B
13 votes cast 12 agree 1 disagree
Agreed with

#1, #3, #5

Pushed back on

#20

Participant 33
Cluster B
10 votes cast 10 agree 0 disagree
Agreed with

#4, #6, #7

Participant 38
Cluster B
20 votes cast 17 agree 3 disagree
Agreed with

#1, #2, #3

Pushed back on

#17, #18, #20

Participant 40
Cluster B
4 votes cast 3 agree 0 disagree
Agreed with

#8, #9, #14

Participant 41
Cluster B
10 votes cast 9 agree 1 disagree
Agreed with

#1, #2, #5

Pushed back on

#17

Participant 43
Cluster B
20 votes cast 14 agree 3 disagree
Agreed with

#1, #2, #3

Pushed back on

#13, #15, #20

Participant 44
Cluster B
20 votes cast 16 agree 4 disagree
Agreed with

#1, #2, #3

Pushed back on

#12, #17, #19

Participant 47
Cluster B
10 votes cast 5 agree 0 disagree
Agreed with

#2, #5, #6

Participant 04
Cluster C
5 votes cast 5 agree 0 disagree
Agreed with

#3, #4, #13

Participant 20
Cluster C
18 votes cast 17 agree 1 disagree
Agreed with

#1, #2, #3

Pushed back on

#20

Participant 30
Cluster C
20 votes cast 17 agree 0 disagree
Agreed with

#1, #2, #3

Participant 34
Cluster C
20 votes cast 18 agree 0 disagree
Agreed with

#1, #2, #3

Participant 36
Cluster C
6 votes cast 3 agree 2 disagree
Agreed with

#3, #5, #17

Pushed back on

#4, #9

Participant 39
Cluster C
10 votes cast 9 agree 0 disagree
Agreed with

#3, #6, #7

Participant 45
Cluster C
6 votes cast 5 agree 0 disagree
Agreed with

#1, #2, #7

Participant 03
Unclustered
1 vote cast 1 agree 0 disagree
Agreed with

#19

Participant 29
Unclustered
5 votes cast 5 agree 0 disagree
Agreed with

#1, #9, #13

Participant 48
Unclustered
20 votes cast 14 agree 3 disagree
Agreed with

#1, #2, #4

Pushed back on

#16, #19, #20

Participant 49
Unclustered
20 votes cast 9 agree 2 disagree
Agreed with

#1, #2, #3

Pushed back on

#8, #19

Participant 50
Unclustered
20 votes cast 17 agree 3 disagree
Agreed with

#1, #2, #3

Pushed back on

#18, #19, #20

Participant 51
Unclustered
10 votes cast 8 agree 2 disagree
Agreed with

#3, #4, #5

Pushed back on

#17, #20

Participant 52
Unclustered
10 votes cast 9 agree 1 disagree
Agreed with

#2, #3, #6

Pushed back on

#19

Participant 53
Unclustered
16 votes cast 13 agree 3 disagree
Agreed with

#1, #3, #4

Pushed back on

#15, #16, #20

Participant 54
Unclustered
2 votes cast 0 agree 0 disagree
Participant 55
Unclustered
14 votes cast 10 agree 2 disagree
Agreed with

#4, #5, #6

Pushed back on

#17, #19

Participant 56
Unclustered
10 votes cast 7 agree 3 disagree
Agreed with

#7, #8, #10

Pushed back on

#5, #17, #20

Participant 57
Unclustered
1 vote cast 1 agree 0 disagree
Agreed with

#7

Participant 58
Unclustered
5 votes cast 0 agree 0 disagree
Participant 59
Unclustered
20 votes cast 17 agree 3 disagree
Agreed with

#1, #2, #4

Pushed back on

#3, #11, #17

Participant 60
Unclustered
2 votes cast 1 agree 1 disagree
Agreed with

#8

Pushed back on

#13

Participant 61
Unclustered
1 vote cast 1 agree 0 disagree
Agreed with

#18

Participant 62
Unclustered
2 votes cast 1 agree 1 disagree
Agreed with

#3

Pushed back on

#13

Selected participant Participant 02

Cluster A 7 votes cast 5 agree 2 disagree

Agreed with

  • #3 If an AI system treats people unfairly because of their race, age, or disability, someone should be held responsible.
  • #5 AI companies should have to pay artists and writers when they use their work to train AI systems.
  • #13 A community that takes care of its people matters more than one with the most advanced technology.

Pushed back on

  • #8 Communities should have a voice in deciding how AI is used in their local schools and hospitals.
  • #19 Companies should not be allowed to replace workers with AI unless they help those workers find new roles.
Selected participant Participant 05

Cluster A 20 votes cast 15 agree 5 disagree

Agreed with

  • #1 Schools should teach children to question what AI tells them, not just how to use it.
  • #2 When AI helps make a decision about your benefits, housing, or health, you should be told.
  • #3 If an AI system treats people unfairly because of their race, age, or disability, someone should be held responsible.

Pushed back on

  • #14 The UK should create an independent body with real power to shut down harmful AI systems.
  • #16 We should slow down on AI until we better understand what it does to people.
  • #17 Families caring for someone with a serious illness should be offered AI tools to help, even if those tools are not perfect.
Selected participant Participant 08

Cluster A 3 votes cast 0 agree 0 disagree

Selected participant Participant 09

Cluster A 1 vote cast 1 agree 0 disagree

Agreed with

  • #11 Workers who lose their jobs because of AI should get real help finding new work, not just advice.
Selected participant Participant 12

Cluster A 4 votes cast 3 agree 1 disagree

Agreed with

  • #1 Schools should teach children to question what AI tells them, not just how to use it.
  • #16 We should slow down on AI until we better understand what it does to people.
  • #18 New AI data centres in Oxfordshire will create good jobs for local people, not just for tech workers from elsewhere.

Pushed back on

  • #20 AI tools in schools do more to help struggling students catch up than they do to harm learning.
Selected participant Participant 13

Cluster A 20 votes cast 13 agree 7 disagree

Agreed with

  • #1 Schools should teach children to question what AI tells them, not just how to use it.
  • #2 When AI helps make a decision about your benefits, housing, or health, you should be told.
  • #4 People should always be able to reach a real person when dealing with a government service, even if AI handles most tasks.

Pushed back on

  • #3 If an AI system treats people unfairly because of their race, age, or disability, someone should be held responsible.
  • #5 AI companies should have to pay artists and writers when they use their work to train AI systems.
  • #7 People are being harmed by AI-driven decisions while the government takes too long to act.
Selected participant Participant 14

Cluster A 20 votes cast 13 agree 4 disagree

Agreed with

  • #2 When AI helps make a decision about your benefits, housing, or health, you should be told.
  • #3 If an AI system treats people unfairly because of their race, age, or disability, someone should be held responsible.
  • #4 People should always be able to reach a real person when dealing with a government service, even if AI handles most tasks.

Pushed back on

  • #1 Schools should teach children to question what AI tells them, not just how to use it.
  • #11 Workers who lose their jobs because of AI should get real help finding new work, not just advice.
  • #16 We should slow down on AI until we better understand what it does to people.
Selected participant Participant 16

Cluster A 20 votes cast 17 agree 3 disagree

Agreed with

  • #1 Schools should teach children to question what AI tells them, not just how to use it.
  • #2 When AI helps make a decision about your benefits, housing, or health, you should be told.
  • #3 If an AI system treats people unfairly because of their race, age, or disability, someone should be held responsible.

Pushed back on

  • #7 People are being harmed by AI-driven decisions while the government takes too long to act.
  • #11 Workers who lose their jobs because of AI should get real help finding new work, not just advice.
  • #19 Companies should not be allowed to replace workers with AI unless they help those workers find new roles.
Selected participant Participant 17

Cluster A 2 votes cast 2 agree 0 disagree

Agreed with

  • #5 AI companies should have to pay artists and writers when they use their work to train AI systems.
  • #12 People living with dementia deserve a say in whether AI tools are used in their care.
Selected participant Participant 19

Cluster A 3 votes cast 3 agree 0 disagree

Agreed with

  • #2 When AI helps make a decision about your benefits, housing, or health, you should be told.
  • #7 People are being harmed by AI-driven decisions while the government takes too long to act.
  • #10 Big technology companies care more about profits than about what happens to our communities.
Selected participant Participant 21

Cluster A 10 votes cast 8 agree 2 disagree

Agreed with

  • #1 Schools should teach children to question what AI tells them, not just how to use it.
  • #3 If an AI system treats people unfairly because of their race, age, or disability, someone should be held responsible.
  • #4 People should always be able to reach a real person when dealing with a government service, even if AI handles most tasks.

Pushed back on

  • #14 The UK should create an independent body with real power to shut down harmful AI systems.
  • #19 Companies should not be allowed to replace workers with AI unless they help those workers find new roles.
Selected participant Participant 24

Cluster A 10 votes cast 8 agree 2 disagree

Agreed with

  • #1 Schools should teach children to question what AI tells them, not just how to use it.
  • #2 When AI helps make a decision about your benefits, housing, or health, you should be told.
  • #3 If an AI system treats people unfairly because of their race, age, or disability, someone should be held responsible.

Pushed back on

  • #15 Using AI to keep an elderly person company when no human is available is better than leaving them alone.
  • #17 Families caring for someone with a serious illness should be offered AI tools to help, even if those tools are not perfect.
Selected participant Participant 25

Cluster A 1 vote cast 0 agree 1 disagree

Pushed back on

  • #16 We should slow down on AI until we better understand what it does to people.
Selected participant Participant 27

Cluster A 8 votes cast 8 agree 0 disagree

Agreed with

  • #3 If an AI system treats people unfairly because of their race, age, or disability, someone should be held responsible.
  • #4 People should always be able to reach a real person when dealing with a government service, even if AI handles most tasks.
  • #6 AI in care homes should free up time for staff to spend with residents, not replace human contact.
Selected participant Participant 28

Cluster A 2 votes cast 1 agree 1 disagree

Agreed with

  • #5 AI companies should have to pay artists and writers when they use their work to train AI systems.

Pushed back on

  • #20 AI tools in schools do more to help struggling students catch up than they do to harm learning.
Selected participant Participant 31

Cluster A 15 votes cast 12 agree 2 disagree

Agreed with

  • #1 Schools should teach children to question what AI tells them, not just how to use it.
  • #2 When AI helps make a decision about your benefits, housing, or health, you should be told.
  • #3 If an AI system treats people unfairly because of their race, age, or disability, someone should be held responsible.

Pushed back on

  • #16 We should slow down on AI until we better understand what it does to people.
  • #20 AI tools in schools do more to help struggling students catch up than they do to harm learning.
Selected participant Participant 32

Cluster A 20 votes cast 15 agree 4 disagree

Agreed with

  • #1 Schools should teach children to question what AI tells them, not just how to use it.
  • #2 When AI helps make a decision about your benefits, housing, or health, you should be told.
  • #3 If an AI system treats people unfairly because of their race, age, or disability, someone should be held responsible.

Pushed back on

  • #11 Workers who lose their jobs because of AI should get real help finding new work, not just advice.
  • #16 We should slow down on AI until we better understand what it does to people.
  • #17 Families caring for someone with a serious illness should be offered AI tools to help, even if those tools are not perfect.
Selected participant Participant 35

Cluster A 20 votes cast 10 agree 1 disagree

Agreed with

  • #1 Schools should teach children to question what AI tells them, not just how to use it.
  • #4 People should always be able to reach a real person when dealing with a government service, even if AI handles most tasks.
  • #5 AI companies should have to pay artists and writers when they use their work to train AI systems.

Pushed back on

  • #15 Using AI to keep an elderly person company when no human is available is better than leaving them alone.
Selected participant Participant 37

Cluster A 20 votes cast 14 agree 6 disagree

Agreed with

  • #1 Schools should teach children to question what AI tells them, not just how to use it.
  • #2 When AI helps make a decision about your benefits, housing, or health, you should be told.
  • #3 If an AI system treats people unfairly because of their race, age, or disability, someone should be held responsible.

Pushed back on

  • #10 Big technology companies care more about profits than about what happens to our communities.
  • #16 We should slow down on AI until we better understand what it does to people.
  • #17 Families caring for someone with a serious illness should be offered AI tools to help, even if those tools are not perfect.
Selected participant Participant 42

Cluster A 20 votes cast 17 agree 3 disagree

Agreed with

  • #1 Schools should teach children to question what AI tells them, not just how to use it.
  • #2 When AI helps make a decision about your benefits, housing, or health, you should be told.
  • #3 If an AI system treats people unfairly because of their race, age, or disability, someone should be held responsible.

Pushed back on

  • #17 Families caring for someone with a serious illness should be offered AI tools to help, even if those tools are not perfect.
  • #18 New AI data centres in Oxfordshire will create good jobs for local people, not just for tech workers from elsewhere.
  • #20 AI tools in schools do more to help struggling students catch up than they do to harm learning.
Selected participant Participant 46

Cluster A 20 votes cast 17 agree 3 disagree

Agreed with

  • #1 Schools should teach children to question what AI tells them, not just how to use it.
  • #2 When AI helps make a decision about your benefits, housing, or health, you should be told.
  • #3 If an AI system treats people unfairly because of their race, age, or disability, someone should be held responsible.

Pushed back on

  • #9 When I talk to a chatbot or AI assistant, I should always be told it is not a real person.
  • #15 Using AI to keep an elderly person company when no human is available is better than leaving them alone.
  • #18 New AI data centres in Oxfordshire will create good jobs for local people, not just for tech workers from elsewhere.
Selected participant Participant 01

Cluster B 20 votes cast 0 agree 0 disagree

Comment author

Selected participant Participant 06

Cluster B 1 vote cast 1 agree 0 disagree

Agreed with

  • #8 Communities should have a voice in deciding how AI is used in their local schools and hospitals.
Selected participant Participant 07

Cluster B 1 vote cast 1 agree 0 disagree

Agreed with

  • #17 Families caring for someone with a serious illness should be offered AI tools to help, even if those tools are not perfect.
Selected participant Participant 10

Cluster B 4 votes cast 4 agree 0 disagree

Agreed with

  • #4 People should always be able to reach a real person when dealing with a government service, even if AI handles most tasks.
  • #8 Communities should have a voice in deciding how AI is used in their local schools and hospitals.
  • #13 A community that takes care of its people matters more than one with the most advanced technology.
Selected participant Participant 11

Cluster B 6 votes cast 4 agree 0 disagree

Agreed with

  • #2 When AI helps make a decision about your benefits, housing, or health, you should be told.
  • #4 People should always be able to reach a real person when dealing with a government service, even if AI handles most tasks.
  • #11 Workers who lose their jobs because of AI should get real help finding new work, not just advice.
Selected participant Participant 15

Cluster B 20 votes cast 16 agree 2 disagree

Agreed with

  • #1 Schools should teach children to question what AI tells them, not just how to use it.
  • #2 When AI helps make a decision about your benefits, housing, or health, you should be told.
  • #3 If an AI system treats people unfairly because of their race, age, or disability, someone should be held responsible.

Pushed back on

  • #19 Companies should not be allowed to replace workers with AI unless they help those workers find new roles.
  • #20 AI tools in schools do more to help struggling students catch up than they do to harm learning.
Selected participant Participant 18

Cluster B 20 votes cast 11 agree 2 disagree

Agreed with

  • #1 Schools should teach children to question what AI tells them, not just how to use it.
  • #2 When AI helps make a decision about your benefits, housing, or health, you should be told.
  • #3 If an AI system treats people unfairly because of their race, age, or disability, someone should be held responsible.

Pushed back on

  • #7 People are being harmed by AI-driven decisions while the government takes too long to act.
  • #18 New AI data centres in Oxfordshire will create good jobs for local people, not just for tech workers from elsewhere.
Selected participant Participant 22

Cluster B 1 vote cast 1 agree 0 disagree

Agreed with

  • #15 Using AI to keep an elderly person company when no human is available is better than leaving them alone.
Selected participant Participant 23

Cluster B 1 vote cast 1 agree 0 disagree

Agreed with

  • #16 We should slow down on AI until we better understand what it does to people.
Selected participant Participant 26

Cluster B 13 votes cast 12 agree 1 disagree

Agreed with

  • #1 Schools should teach children to question what AI tells them, not just how to use it.
  • #3 If an AI system treats people unfairly because of their race, age, or disability, someone should be held responsible.
  • #5 AI companies should have to pay artists and writers when they use their work to train AI systems.

Pushed back on

  • #20 AI tools in schools do more to help struggling students catch up than they do to harm learning.
Selected participant Participant 33

Cluster B 10 votes cast 10 agree 0 disagree

Agreed with

  • #4 People should always be able to reach a real person when dealing with a government service, even if AI handles most tasks.
  • #6 AI in care homes should free up time for staff to spend with residents, not replace human contact.
  • #7 People are being harmed by AI-driven decisions while the government takes too long to act.
Selected participant Participant 38

Cluster B 20 votes cast 17 agree 3 disagree

Agreed with

  • #1 Schools should teach children to question what AI tells them, not just how to use it.
  • #2 When AI helps make a decision about your benefits, housing, or health, you should be told.
  • #3 If an AI system treats people unfairly because of their race, age, or disability, someone should be held responsible.

Pushed back on

  • #17 Families caring for someone with a serious illness should be offered AI tools to help, even if those tools are not perfect.
  • #18 New AI data centres in Oxfordshire will create good jobs for local people, not just for tech workers from elsewhere.
  • #20 AI tools in schools do more to help struggling students catch up than they do to harm learning.
Selected participant Participant 40

Cluster B 4 votes cast 3 agree 0 disagree

Agreed with

  • #8 Communities should have a voice in deciding how AI is used in their local schools and hospitals.
  • #9 When I talk to a chatbot or AI assistant, I should always be told it is not a real person.
  • #14 The UK should create an independent body with real power to shut down harmful AI systems.
Selected participant Participant 41

Cluster B 10 votes cast 9 agree 1 disagree

Agreed with

  • #1 Schools should teach children to question what AI tells them, not just how to use it.
  • #2 When AI helps make a decision about your benefits, housing, or health, you should be told.
  • #5 AI companies should have to pay artists and writers when they use their work to train AI systems.

Pushed back on

  • #17 Families caring for someone with a serious illness should be offered AI tools to help, even if those tools are not perfect.
Selected participant Participant 43

Cluster B 20 votes cast 14 agree 3 disagree

Agreed with

  • #1 Schools should teach children to question what AI tells them, not just how to use it.
  • #2 When AI helps make a decision about your benefits, housing, or health, you should be told.
  • #3 If an AI system treats people unfairly because of their race, age, or disability, someone should be held responsible.

Pushed back on

  • #13 A community that takes care of its people matters more than one with the most advanced technology.
  • #15 Using AI to keep an elderly person company when no human is available is better than leaving them alone.
  • #20 AI tools in schools do more to help struggling students catch up than they do to harm learning.
Selected participant Participant 44

Cluster B 20 votes cast 16 agree 4 disagree

Agreed with

  • #1 Schools should teach children to question what AI tells them, not just how to use it.
  • #2 When AI helps make a decision about your benefits, housing, or health, you should be told.
  • #3 If an AI system treats people unfairly because of their race, age, or disability, someone should be held responsible.

Pushed back on

  • #12 People living with dementia deserve a say in whether AI tools are used in their care.
  • #17 Families caring for someone with a serious illness should be offered AI tools to help, even if those tools are not perfect.
  • #19 Companies should not be allowed to replace workers with AI unless they help those workers find new roles.
Selected participant Participant 47

Cluster B 10 votes cast 5 agree 0 disagree

Agreed with

  • #2 When AI helps make a decision about your benefits, housing, or health, you should be told.
  • #5 AI companies should have to pay artists and writers when they use their work to train AI systems.
  • #6 AI in care homes should free up time for staff to spend with residents, not replace human contact.
Selected participant Participant 04

Cluster C 5 votes cast 5 agree 0 disagree

Agreed with

  • #3 If an AI system treats people unfairly because of their race, age, or disability, someone should be held responsible.
  • #4 People should always be able to reach a real person when dealing with a government service, even if AI handles most tasks.
  • #13 A community that takes care of its people matters more than one with the most advanced technology.
Selected participant Participant 20

Cluster C 18 votes cast 17 agree 1 disagree

Agreed with

  • #1 Schools should teach children to question what AI tells them, not just how to use it.
  • #2 When AI helps make a decision about your benefits, housing, or health, you should be told.
  • #3 If an AI system treats people unfairly because of their race, age, or disability, someone should be held responsible.

Pushed back on

  • #20 AI tools in schools do more to help struggling students catch up than they do to harm learning.
Selected participant Participant 30

Cluster C 20 votes cast 17 agree 0 disagree

Agreed with

  • #1 Schools should teach children to question what AI tells them, not just how to use it.
  • #2 When AI helps make a decision about your benefits, housing, or health, you should be told.
  • #3 If an AI system treats people unfairly because of their race, age, or disability, someone should be held responsible.
Selected participant Participant 34

Cluster C 20 votes cast 18 agree 0 disagree

Agreed with

  • #1 Schools should teach children to question what AI tells them, not just how to use it.
  • #2 When AI helps make a decision about your benefits, housing, or health, you should be told.
  • #3 If an AI system treats people unfairly because of their race, age, or disability, someone should be held responsible.
Selected participant Participant 36

Cluster C 6 votes cast 3 agree 2 disagree

Agreed with

  • #3 If an AI system treats people unfairly because of their race, age, or disability, someone should be held responsible.
  • #5 AI companies should have to pay artists and writers when they use their work to train AI systems.
  • #17 Families caring for someone with a serious illness should be offered AI tools to help, even if those tools are not perfect.

Pushed back on

  • #4 People should always be able to reach a real person when dealing with a government service, even if AI handles most tasks.
  • #9 When I talk to a chatbot or AI assistant, I should always be told it is not a real person.
Selected participant Participant 39

Cluster C 10 votes cast 9 agree 0 disagree

Agreed with

  • #3 If an AI system treats people unfairly because of their race, age, or disability, someone should be held responsible.
  • #6 AI in care homes should free up time for staff to spend with residents, not replace human contact.
  • #7 People are being harmed by AI-driven decisions while the government takes too long to act.
Selected participant Participant 45

Cluster C 6 votes cast 5 agree 0 disagree

Agreed with

  • #1 Schools should teach children to question what AI tells them, not just how to use it.
  • #2 When AI helps make a decision about your benefits, housing, or health, you should be told.
  • #7 People are being harmed by AI-driven decisions while the government takes too long to act.
Selected participant Participant 03

Unclustered 1 vote cast 1 agree 0 disagree

Agreed with

  • #19 Companies should not be allowed to replace workers with AI unless they help those workers find new roles.
Selected participant Participant 29

Unclustered 5 votes cast 5 agree 0 disagree

Agreed with

  • #1 Schools should teach children to question what AI tells them, not just how to use it.
  • #9 When I talk to a chatbot or AI assistant, I should always be told it is not a real person.
  • #13 A community that takes care of its people matters more than one with the most advanced technology.
Selected participant Participant 48

Unclustered 20 votes cast 14 agree 3 disagree

Agreed with

  • #1 Schools should teach children to question what AI tells them, not just how to use it.
  • #2 When AI helps make a decision about your benefits, housing, or health, you should be told.
  • #4 People should always be able to reach a real person when dealing with a government service, even if AI handles most tasks.

Pushed back on

  • #16 We should slow down on AI until we better understand what it does to people.
  • #19 Companies should not be allowed to replace workers with AI unless they help those workers find new roles.
  • #20 AI tools in schools do more to help struggling students catch up than they do to harm learning.
Selected participant Participant 49

Unclustered 20 votes cast 9 agree 2 disagree

Agreed with

  • #1 Schools should teach children to question what AI tells them, not just how to use it.
  • #2 When AI helps make a decision about your benefits, housing, or health, you should be told.
  • #3 If an AI system treats people unfairly because of their race, age, or disability, someone should be held responsible.

Pushed back on

  • #8 Communities should have a voice in deciding how AI is used in their local schools and hospitals.
  • #19 Companies should not be allowed to replace workers with AI unless they help those workers find new roles.
Selected participant Participant 50

Unclustered 20 votes cast 17 agree 3 disagree

Agreed with

  • #1 Schools should teach children to question what AI tells them, not just how to use it.
  • #2 When AI helps make a decision about your benefits, housing, or health, you should be told.
  • #3 If an AI system treats people unfairly because of their race, age, or disability, someone should be held responsible.

Pushed back on

  • #18 New AI data centres in Oxfordshire will create good jobs for local people, not just for tech workers from elsewhere.
  • #19 Companies should not be allowed to replace workers with AI unless they help those workers find new roles.
  • #20 AI tools in schools do more to help struggling students catch up than they do to harm learning.
Selected participant Participant 51

Unclustered 10 votes cast 8 agree 2 disagree

Agreed with

  • #3 If an AI system treats people unfairly because of their race, age, or disability, someone should be held responsible.
  • #4 People should always be able to reach a real person when dealing with a government service, even if AI handles most tasks.
  • #5 AI companies should have to pay artists and writers when they use their work to train AI systems.

Pushed back on

  • #17 Families caring for someone with a serious illness should be offered AI tools to help, even if those tools are not perfect.
  • #20 AI tools in schools do more to help struggling students catch up than they do to harm learning.
Selected participant Participant 52

Unclustered 10 votes cast 9 agree 1 disagree

Agreed with

  • #2 When AI helps make a decision about your benefits, housing, or health, you should be told.
  • #3 If an AI system treats people unfairly because of their race, age, or disability, someone should be held responsible.
  • #6 AI in care homes should free up time for staff to spend with residents, not replace human contact.

Pushed back on

  • #19 Companies should not be allowed to replace workers with AI unless they help those workers find new roles.
Selected participant Participant 53

Unclustered 16 votes cast 13 agree 3 disagree

Agreed with

  • #1 Schools should teach children to question what AI tells them, not just how to use it.
  • #3 If an AI system treats people unfairly because of their race, age, or disability, someone should be held responsible.
  • #4 People should always be able to reach a real person when dealing with a government service, even if AI handles most tasks.

Pushed back on

  • #15 Using AI to keep an elderly person company when no human is available is better than leaving them alone.
  • #16 We should slow down on AI until we better understand what it does to people.
  • #20 AI tools in schools do more to help struggling students catch up than they do to harm learning.
Selected participant Participant 54

Unclustered 2 votes cast 0 agree 0 disagree

Selected participant Participant 55

Unclustered 14 votes cast 10 agree 2 disagree

Agreed with

  • #4 People should always be able to reach a real person when dealing with a government service, even if AI handles most tasks.
  • #5 AI companies should have to pay artists and writers when they use their work to train AI systems.
  • #6 AI in care homes should free up time for staff to spend with residents, not replace human contact.

Pushed back on

  • #17 Families caring for someone with a serious illness should be offered AI tools to help, even if those tools are not perfect.
  • #19 Companies should not be allowed to replace workers with AI unless they help those workers find new roles.
Selected participant Participant 56

Unclustered 10 votes cast 7 agree 3 disagree

Agreed with

  • #7 People are being harmed by AI-driven decisions while the government takes too long to act.
  • #8 Communities should have a voice in deciding how AI is used in their local schools and hospitals.
  • #10 Big technology companies care more about profits than about what happens to our communities.

Pushed back on

  • #5 AI companies should have to pay artists and writers when they use their work to train AI systems.
  • #17 Families caring for someone with a serious illness should be offered AI tools to help, even if those tools are not perfect.
  • #20 AI tools in schools do more to help struggling students catch up than they do to harm learning.
Selected participant Participant 57

Unclustered 1 vote cast 1 agree 0 disagree

Agreed with

  • #7 People are being harmed by AI-driven decisions while the government takes too long to act.
Selected participant Participant 58

Unclustered 5 votes cast 0 agree 0 disagree

Selected participant Participant 59

Unclustered 20 votes cast 17 agree 3 disagree

Agreed with

  • #1 Schools should teach children to question what AI tells them, not just how to use it.
  • #2 When AI helps make a decision about your benefits, housing, or health, you should be told.
  • #4 People should always be able to reach a real person when dealing with a government service, even if AI handles most tasks.

Pushed back on

  • #3 If an AI system treats people unfairly because of their race, age, or disability, someone should be held responsible.
  • #11 Workers who lose their jobs because of AI should get real help finding new work, not just advice.
  • #17 Families caring for someone with a serious illness should be offered AI tools to help, even if those tools are not perfect.
Selected participant Participant 60

Unclustered 2 votes cast 1 agree 1 disagree

Agreed with

  • #8 Communities should have a voice in deciding how AI is used in their local schools and hospitals.

Pushed back on

  • #13 A community that takes care of its people matters more than one with the most advanced technology.
Selected participant Participant 61

Unclustered 1 vote cast 1 agree 0 disagree

Agreed with

  • #18 New AI data centres in Oxfordshire will create good jobs for local people, not just for tech workers from elsewhere.
Selected participant Participant 62

Unclustered 2 votes cast 1 agree 1 disagree

Agreed with

  • #3 If an AI system treats people unfairly because of their race, age, or disability, someone should be held responsible.

Pushed back on

  • #13 A community that takes care of its people matters more than one with the most advanced technology.

Method

Method and limits

This page aims for a Habermolt-like reading experience, but it stays honest about what Polis exports can and cannot provide.

What this reconstruction does
  • Comments become statements.
  • The leading statement is ranked by overall support plus the weakest support across the discovered Polis clusters.
  • The statement and participant maps are simple two-dimensional projections of vote vectors, so proximity means similar voting patterns, not exact ideological distance.

What is missing compared with Habermolt
  • Agent names, authored opinions, semantic embeddings, and native ranking history.
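The bridge-floor ranking described above can be sketched in a few lines. This is an illustrative reconstruction, not the page's actual code: the vote encoding (1 = agree, -1 = disagree, None = pass) and the toy cluster data are assumptions for the example.

```python
# Sketch of the "bridge floor" ranking: a statement is scored by its
# overall support together with its weakest support across clusters.
# Vote encoding and data here are illustrative, not from the exports.

def support(votes):
    """Fraction of agree votes among agree/disagree (passes excluded)."""
    cast = [v for v in votes if v is not None]
    return sum(1 for v in cast if v == 1) / len(cast) if cast else 0.0

def bridge_floor(votes_by_cluster):
    """Weakest per-cluster support for one statement."""
    return min(support(v) for v in votes_by_cluster.values())

# Toy example: one statement's votes, keyed by cluster
votes = {
    "A": [1, 1, 1, -1],   # 75% support
    "B": [1, 1, -1],      # ~67% support
    "C": [1, 1, 1],       # 100% support
}
overall = support([v for c in votes.values() for v in c])  # 0.8
floor = bridge_floor(votes)                                # ~0.67
```

Ranking by the pair (overall, floor) is what lets a statement like #1 lead: it is not just popular overall, it is popular in every cluster.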

No hidden APIs were used. Everything on the page is derived from the five export files above.
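The two-dimensional maps mentioned above can be reproduced with a standard projection of the vote vectors. The sketch below uses plain PCA via SVD on a toy participant-by-statement matrix; the actual page may use a different projection, and the matrix values here (1 agree, -1 disagree, 0 pass/unseen) are assumptions for illustration.

```python
# Illustrative 2D projection of vote vectors, as in the statement and
# participant maps: nearby points voted similarly. Toy data only.
import numpy as np

# Rows: participants, columns: statements
votes = np.array([
    [ 1,  1, -1,  0],
    [ 1,  1,  1, -1],
    [-1, -1,  1,  1],
    [ 1,  0, -1,  1],
], dtype=float)

centered = votes - votes.mean(axis=0)
# SVD yields the principal axes; the first two right-singular
# vectors span the 2D map.
_, _, vt = np.linalg.svd(centered, full_matrices=False)
coords = centered @ vt[:2].T  # each participant's (x, y) position
```

Because the axes come from vote covariance alone, distance on the map reflects voting similarity, not any semantic reading of the statements — which is exactly the caveat stated above.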