SCENARIO 2025 (1): TAKING PART IN THE DESIGN OF ACTIONS TO INCORPORATE THE USE OF ARTIFICIAL INTELLIGENCE INTO THE TEACHING-LEARNING PROCESS AT YOUR SCHOOL

EDUCATION IS THE KEY TO PROGRESS (Mª MOLINER)

AI is not only a technological tool, but also a driver for holistic development and lifelong learning.

1. Description of three actions to be developed to maximise the potential of AI for:

    a) Creation of educational resources.

Teachers play a key role in creating effective and stimulating educational materials for students. AI, especially generative AI, offers tools and technologies that can enhance the creation of these resources. Examples of these resources include:

- Automated content generators, which enable the generation of educational materials such as quizzes, exercises, activities, videos, simulations, games, multimedia presentations, schedules, and learning scenarios. These resources can be adapted to different learning styles and facilitate the understanding of complex concepts. As an example, a text adapted to the needs of the students could be created automatically, taking into account variables such as the number of characters, subject matter, font, difficulty, inclusion of images, and organisation into paragraphs, in order to address specific content effectively. Example: creating an escape room on evolution for 4th year of ESO, or creating competency-based exams in line with Orders 754 and 755/2022.

- Adaptation of content to different languages, making educational materials accessible to all students from different linguistic and cultural backgrounds. As an example, a teacher could use a machine translation system to translate an information circular from the original language into the native language of a student and his or her family. Example: asking the AI to translate the escape-room rules into Ukrainian for a Ukrainian student who has recently arrived in Spain.
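The parameter-driven text generation described above can be sketched as a small prompt builder that feeds a generative model. The function name, parameters and wording below are illustrative assumptions, not any specific tool's API:

```python
# Minimal sketch: compose a structured prompt for a generative-AI
# content tool from the variables mentioned in the text. Illustrative
# only; sending the prompt to an actual model is left out.

def build_text_prompt(topic, max_chars, difficulty, paragraphs, include_images):
    """Compose a prompt asking a generative model for an adapted text."""
    parts = [
        f"Write an educational text about '{topic}'.",
        f"Keep it under {max_chars} characters.",
        f"Target difficulty: {difficulty}.",
        f"Organise it into {paragraphs} paragraphs.",
    ]
    if include_images:
        parts.append("Suggest one image caption per paragraph.")
    return " ".join(parts)

prompt = build_text_prompt("evolution", 1200, "4th year ESO", 3, True)
print(prompt)
```

The same pattern extends to quizzes or escape-room clues by swapping the instruction sentences.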

    b) Improving assessment systems.

It is necessary for teachers to know how the systems or tools they use in the classroom work, so that they can make informed and justified decisions. Some examples of the use of AI in assessment are:

- Automated, immediate and personalised feedback. Available AI systems provide specific, personalised feedback for each student. These systems allow teachers to identify areas for improvement in their students and automatically offer suggested activities, so that students receive the attention and support they need to reach their full potential. As an example, an online educational platform using AI can give students immediate feedback on their answers to given questions. If a student makes a mistake, the system not only helps them identify the error, but also provides a detailed explanation and suggestions for correcting it. Example: with Flubaroo, a free Google tool, you can grade tests (Google Forms) and obtain statistics, but this is purely automatic and does not generate feedback with observations. For that purpose, Google has made its generative AI (Gemini) available, for now in English, for use in Google Classroom.

- Automated assessment, using AI algorithms to correct tests, exams and assignments quickly and efficiently. These systems can analyse written answers (multiple choice or essays, for example), providing accurate results in a short time. As an example, a teacher could use an automated assessment system to correct multiple-choice tests: the system scans the students' answers, compares them with the correct ones and automatically generates the corresponding scores. Example: Flubaroo.
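A minimal sketch of the Flubaroo-style grading described above, with a short explanation attached to each wrong answer (in practice a generative model could supply these explanations). The questions, answer key and wording are invented for illustration:

```python
# Compare each student's multiple-choice answers with an answer key,
# score them, and attach a feedback note per mistake. All content here
# is a made-up example, not real exam material.

ANSWER_KEY = {"Q1": "B", "Q2": "D", "Q3": "A"}
EXPLANATIONS = {
    "Q1": "Natural selection acts on heritable variation.",
    "Q2": "Mutation is the ultimate source of new alleles.",
    "Q3": "Genetic drift is strongest in small populations.",
}

def grade(answers):
    """Return (score, feedback list) for one student's answers."""
    score, feedback = 0, []
    for q, correct in ANSWER_KEY.items():
        if answers.get(q) == correct:
            score += 1
        else:
            feedback.append(f"{q}: expected {correct}. {EXPLANATIONS[q]}")
    return score, feedback

score, notes = grade({"Q1": "B", "Q2": "C", "Q3": "A"})
print(score)      # 2 correct out of 3
print(notes[0])   # explanation for the missed question
```

This is the "immediate feedback" loop in miniature: the score is automatic, and the per-question note is what distinguishes feedback from mere grading.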

    c) Personalising learning experiences.

Personalisation of learning is a pedagogical approach that seeks to adapt the educational process to the individual needs of each learner. AI offers tools and techniques that allow teachers to personalise learning more effectively. The following are some of the ways in which teachers can use AI to personalise learning:

- Analysis of learner data. This involves collecting, processing and analysing data on student performance, learning preferences and other relevant student factors. Predictive models can help teachers identify patterns and trends in student data, enabling them to anticipate and adjust their pedagogical approach to meet individual needs. As an example, a classroom learning management system would allow teachers to collect data on each student's progress, automatically generate a report and thus identify specific student difficulties in order to offer personalised support.

- Content recommendation systems. These systems use AI algorithms to suggest specific educational resources and learning activities that are personalised, relevant and appropriate to each learner's individual needs. As an example, an online educational platform using an AI-based recommendation system can create personalised learning pathways aligned to the interests and skill level of the learner.
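The two ideas above, analysing learner data and recommending matching content, can be sketched together: flag students whose average falls below a threshold, then suggest a resource for their weakest topic. All names, scores and resources below are invented for illustration:

```python
# Toy learner-data analysis + recommendation step. A real LMS would
# pull this data from gradebooks; here it is hard-coded.

SCORES = {
    "Ana":  {"genetics": 9, "ecology": 8, "evolution": 7},
    "Luis": {"genetics": 4, "ecology": 6, "evolution": 3},
}
RESOURCES = {
    "genetics":  "Interactive Punnett-square simulator",
    "ecology":   "Food-web modelling activity",
    "evolution": "Escape room on natural selection",
}

def support_plan(scores, threshold=5.0):
    """Map each struggling student to a resource for their weakest topic."""
    plan = {}
    for student, by_topic in scores.items():
        avg = sum(by_topic.values()) / len(by_topic)
        if avg < threshold:                        # flag low averages
            weakest = min(by_topic, key=by_topic.get)
            plan[student] = RESOURCES[weakest]     # personalised suggestion
    return plan

plan = support_plan(SCORES)
print(plan)   # only Luis falls below the threshold
```

A production recommender would use richer signals (interests, pace, prior pathways), but the shape of the decision is the same.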

2. Measures envisaged to guarantee:

    a) A safe, ethical and responsible learning environment.

The top ten requirements for the good use of AI should contain, address or provide a solution to ethical dilemmas such as:

1. Appropriate and timely integration. AI must be used for the common good of humanity and avoid harming or damaging people or the environment.
2. Transparency and awareness. AI systems must be transparent in their operation and decisions, allowing people to understand how they work and why they make certain decisions.
3. Fairness and non-discrimination. We must develop and use AI systems in a fair and equitable manner, avoiding any discrimination or bias towards specific individuals or groups.
4. Robustness and security. AI systems must be robust and secure, protected against attacks that could put individuals or society at risk.
5. Privacy and data protection. AI must respect people's privacy and protect their personal data, ensuring its responsible and ethical use.
6. Human supervision. AI should remain under human supervision, ensuring that people retain control over its development and use, and that no autonomous decisions are made that could negatively affect society.
7. Human compatibility. AI must be designed to be compatible with human values and dignity, respecting people's autonomy, freedom and privacy.
8. Promotion of social welfare. AI should be used to promote social welfare and sustainable development, contributing to the solution of global problems such as poverty, climate change and disease.
9. Collaborative learning. When implementing an AI system or tool, the participation of different actors should be encouraged to ensure different perspectives and values.
10. Reflection and anticipation. There should be continuous assessment of the ethical and social impact of AI, anticipating possible future risks and challenges, and promoting an open and transparent dialogue between all stakeholders.

PRACTICAL SUGGESTIONS FOR DEALING WITH ETHICS AND AI WITH YOUR STUDENTS

ALGORITHMIC BIASES

It is a fact that AI algorithms can be affected by biases (social, racial, gender-based). Therefore, it is important that both teachers and students are aware of this and work actively to mitigate their impact on decision-making.

- Sample activity: analysis of case studies.
- Description: present students with real cases where algorithmic biases have had a negative impact on society. For example, genetic data and discrimination in healthcare:
  - Case: algorithms trained on predominantly European genetic data have been used to predict disease risk or recommend treatments.
  - Impact: these models often perform poorly for individuals of non-European descent, potentially leading to misdiagnosis or lack of proper treatment for people from underrepresented genetic backgrounds.
  - Example: polygenic risk scores (used for predicting diseases such as breast cancer or heart disease) are far less accurate for African, Asian or Indigenous populations.
  - Lesson: lack of diversity in genetic datasets can lead to real-world health disparities.
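The dataset-skew problem in the case above can be shown to students with a toy numeric example: a decision threshold tuned on one group's data misclassifies cases from a group whose score distribution is shifted. All numbers below are synthetic, chosen only to make the disparity visible:

```python
# Toy illustration of skewed training data: a risk threshold that is
# perfect for the well-represented group fails on a shifted group.

# (biomarker value, has_condition) pairs for two synthetic populations
group_a = [(2, 0), (3, 0), (6, 1), (7, 1)]   # well represented in "training"
group_b = [(5, 0), (6, 0), (9, 1), (10, 1)]  # shifted distribution

def accuracy(data, threshold):
    """Fraction of cases where 'value >= threshold' matches the label."""
    hits = sum((value >= threshold) == bool(label) for value, label in data)
    return hits / len(data)

threshold = 5  # chosen to separate group A perfectly
print(accuracy(group_a, threshold))  # perfect on the majority group
print(accuracy(group_b, threshold))  # much worse on the shifted group
```

In class, students can vary the threshold and the group distributions to see that no single cutoff serves both groups when the data is skewed.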

DEEPFAKES

The creation of false content through deepfakes is a threat to the authenticity and integrity of information. Teachers and students need to be taught to discern between truthful and false information, and to develop analytical and critical thinking skills.

- Sample activity: workshop on creating and detecting deepfakes.
- Description: students will learn how to create deepfakes using online tools and will then be challenged to identify these false contents in images, audio and/or video created by other classmates on topics such as genetics, ecology or vaccines.

  • Learn how deepfakes are made – using accessible online tools, students will be introduced to basic techniques for creating synthetic media, such as:

    • AI-generated portraits of "scientists" or "patients"

    • Fake audio clips of genetic researchers discussing fabricated findings

    • Edited videos that simulate false discoveries in genetics

  • Create their own genetic deepfakes – in small groups, students will design a fake piece of media that could hypothetically be used to mislead an audience about a genetic topic (e.g., fake data about a new gene therapy or mutation).

  • Detect deepfakes made by peers – after the creation phase, teams will swap deepfakes and attempt to identify and explain what’s fake, how it was likely made, and why it might be dangerous or misleading.

  • Tools/Platforms (suggested):

    • This Person Does Not Exist (for fake images)

    • ElevenLabs or FakeYou (for voice cloning – free demo versions available)

    • Runway ML or D-ID (for video deepfake generation)

    • Deepware Scanner or Sensity AI (for detecting deepfakes)


    ⚠️ Ethical Note:
    Before starting the activity, emphasize responsible use of AI and the importance of truth and transparency in science. Remind students that the goal is to learn how to protect against misinformation, not to spread it.

    b) Privacy of students' and teachers' data when using AI tools in the classroom.

Vulnerability of students' personal data.

In an AI-driven educational environment, the collection and handling of personal student data is inevitable. However, if this data is not properly managed, it could be exposed to privacy and security risks, compromising students' future work or personal lives.

Establish specific data protection policies for our students.

Educational institutions should implement a firm regulatory framework that addresses a variety of issues related to AI, including data privacy, algorithmic transparency, fairness and accountability. These policies should be a specific extension of general data protection policies.

The protection of students' personal information is a fundamental right that must be guaranteed at all times. It is important that, in addition to ensuring policies and strategies in this regard, we work with students so that they are aware and responsible when it comes to handing over their personal data.

- Activity: infographic on the safe use of data on social networks.
- Description: hold a collaborative debate among students to make them aware that mobile devices and social networks are not just toys. Produce an infographic with the main conclusions of the debate.
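One concrete safeguard that follows from the policies above is pseudonymising student work before it is sent to any external AI tool. This is a minimal sketch under assumed field names; the salt would be kept by the school and never shared, and real deployments would need a full data-protection review:

```python
# Data-minimisation sketch: drop direct identifiers and replace the
# student's name with a stable pseudonym before external processing.
# Field names ("name", "email", "essay") are illustrative assumptions.

import hashlib

SECRET_SALT = "change-me"  # held by the school, never shared

def pseudonymise(record):
    """Strip identifiers; keep only what the AI tool needs."""
    token = hashlib.sha256(
        (SECRET_SALT + record["name"]).encode()
    ).hexdigest()[:12]
    return {"student_id": token, "essay": record["essay"]}

record = {"name": "Ana García", "email": "ana@school.es",
          "essay": "Natural selection explains adaptation..."}
safe = pseudonymise(record)
print(sorted(safe))  # only 'essay' and 'student_id' survive
```

Because the pseudonym is deterministic, the teacher can link AI feedback back to the student locally without the external tool ever seeing a name or email.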

