[Solved] How to add vec2 for UV for texture mapping when using indices to build cube

Fabrizo Asks: How to add vec2 for UV for texture mapping when using indices to build cube
I am trying to apply texture mapping to my cubes but am unsure how to proceed. Currently I am using indices to avoid repeating vec3s when building the cube, together with a vertex array of the positions and their normals, like so:

Code:
// Cube data as our basic building block
unsigned int indices[] = {
    10, 8, 0, 2, 10, 0, 12, 10, 2, 4, 12, 2,
    14, 12, 4, 6, 14, 4, 8, 14, 6, 0, 8, 6,
    12, 14, 8, 10, 12, 8, 2, 0, 6, 4, 2, 6
};

vec3 vertexArray[] = {
    vec3(-0.5f, -0.5f, -0.5f),   vec3(-0.408248, -0.816497, -0.408248),
    vec3( 0.5f, -0.5f, -0.5f),   vec3( 0.666667, -0.333333, -0.666667),
    vec3( 0.5f,  0.5f, -0.5f),   vec3( 0.408248,  0.816497, -0.408248),
    vec3(-0.5f,  0.5f, -0.5f),   vec3(-0.666667,  0.333333, -0.666667),
    vec3(-0.5f, -0.5f,  0.5f),   vec3(-0.666667, -0.333333,  0.666667),
    vec3( 0.5f, -0.5f,  0.5f),   vec3( 0.666667, -0.666667,  0.333333),
    vec3( 0.5f,  0.5f,  0.5f),   vec3( 0.408248,  0.408248,  0.816497),
    vec3(-0.5f,  0.5f,  0.5f),   vec3(-0.408248,  0.816497,  0.408248),
};

// convert arrays to vectors
std::vector<vec3> vertexArrayVector;
vertexArrayVector.insert(vertexArrayVector.begin(), std::begin(vertexArray), std::end(vertexArray));

std::vector<unsigned int> indicesVector;
indicesVector.insert(indicesVector.begin(), std::begin(indices), std::end(indices));

I now want to apply textures to the cube, but I am not sure how to add a vec2 for the UVs when using indices. I create my VBOs and VAO like this, if it helps:

Code:
    GLuint vertexBufferObject;
    GLuint indexBufferObject;
    GLuint vertexArrayObject;
    glGenVertexArrays(1, &vertexArrayObject);
    glGenBuffers(1, &indexBufferObject);
    glGenBuffers(1, &vertexBufferObject);

    glBindVertexArray(vertexArrayObject);


    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, indexBufferObject);
    glBufferData(GL_ELEMENT_ARRAY_BUFFER, sizeof(indicesVector[0]) * indicesVector.size(), &indicesVector[0], GL_STATIC_DRAW);

    // Upload Vertex Buffer to the GPU, keep a reference to it (vertexBufferObject)

    glBindBuffer(GL_ARRAY_BUFFER, vertexBufferObject);
    glBufferData(GL_ARRAY_BUFFER, sizeof(vertexArrayVector[0]) * vertexArrayVector.size(), &vertexArrayVector[0], GL_STATIC_DRAW);

    // Teach GPU how to read position data from vertexBufferObject
    glVertexAttribPointer(0,                   // attribute 0 matches aPos in Vertex Shader
        3,                   // size
        GL_FLOAT,            // type
        GL_FALSE,            // normalized?
        0,                   // 0 stride
        (void*)0              // array buffer offset
    );
    glEnableVertexAttribArray(0);

    // Teach GPU how to read normals data from vertexBufferObject
    glVertexAttribPointer(1,                            // attribute 1 matches normals in Vertex Shader
        3,
        GL_FLOAT,
        GL_FALSE,
        0,
        (void*)sizeof(glm::vec3)      // normal is offset by one vec3 (comes after position)
    );
    glEnableVertexAttribArray(1);
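A possible direction (not from the original post; a minimal sketch assuming GLM types, with illustrative names): once the cube is textured, the eight shared corners generally cannot stay shared, because each corner belongs to three faces that need different UVs. A common approach is to switch to an interleaved vertex struct and build four vertices plus six indices per face, which gives 24 vertices and 36 indices for the whole cube:

Code:
#include <glm/glm.hpp>

// Hypothetical interleaved vertex layout: position, normal, UV
struct TexturedVertex
{
    glm::vec3 position;
    glm::vec3 normal;
    glm::vec2 uv;
};

// One face (+Z) as an example; the other five faces follow the same pattern
// (flat per-face normals used here for simplicity)
TexturedVertex frontFace[] = {
    { glm::vec3(-0.5f, -0.5f, 0.5f), glm::vec3(0.0f, 0.0f, 1.0f), glm::vec2(0.0f, 0.0f) },
    { glm::vec3( 0.5f, -0.5f, 0.5f), glm::vec3(0.0f, 0.0f, 1.0f), glm::vec2(1.0f, 0.0f) },
    { glm::vec3( 0.5f,  0.5f, 0.5f), glm::vec3(0.0f, 0.0f, 1.0f), glm::vec2(1.0f, 1.0f) },
    { glm::vec3(-0.5f,  0.5f, 0.5f), glm::vec3(0.0f, 0.0f, 1.0f), glm::vec2(0.0f, 1.0f) },
};

// Two triangles per face, indexing the four vertices above (counter-clockwise)
unsigned int frontFaceIndices[] = { 0, 1, 2, 2, 3, 0 };

The same idea works with separate position/normal/UV arrays bound to separate buffers, but a single interleaved struct keeps each vertex's attributes together and makes the stride explicit.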

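With an interleaved layout like the hypothetical TexturedVertex above, the attribute setup would replace the zero stride with sizeof(TexturedVertex) and add a third attribute for the UVs (attribute locations are assumed to match the vertex shader):

Code:
#include <cstddef>   // offsetof

glBindBuffer(GL_ARRAY_BUFFER, vertexBufferObject);

// attribute 0: position (vec3)
glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, sizeof(TexturedVertex),
    (void*)offsetof(TexturedVertex, position));
glEnableVertexAttribArray(0);

// attribute 1: normal (vec3)
glVertexAttribPointer(1, 3, GL_FLOAT, GL_FALSE, sizeof(TexturedVertex),
    (void*)offsetof(TexturedVertex, normal));
glEnableVertexAttribArray(1);

// attribute 2: UV (vec2), the new texture coordinate attribute
glVertexAttribPointer(2, 2, GL_FLOAT, GL_FALSE, sizeof(TexturedVertex),
    (void*)offsetof(TexturedVertex, uv));
glEnableVertexAttribArray(2);

The vertex shader would then declare a matching input, e.g. layout(location = 2) in vec2 aUV;, and pass it on to the fragment shader for sampling the texture.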