sábado, 21 de junio de 2014

MPI commands that will be used in this blog


  In this post I will explain the basic MPI commands that will be used throughout this blog and the video tutorials.


 MPI_Init( &argc, &argv );

This command is used at the start of our code.
It initializes the MPI environment so that parallel communication between processes can begin.

MPI_Finalize();

This command is used at the end of our code.
It terminates the MPI execution environment.



   MPI_Comm_size( MPI_COMM_WORLD, &nprocs );
 where nprocs is an integer.
This command stores the number of processes in nprocs.



   MPI_Comm_rank( MPI_COMM_WORLD, &myproc );
where myproc is an integer.
This command gives each process a number (its rank) that identifies it; the value is different for each process.



   MPI_Send(void *buf, int count, MPI_Datatype datatype, int dest, int tag, MPI_Comm comm)
where buf is the initial address of the send buffer
where count is the number of elements in the send buffer (nonnegative integer)
where datatype is the datatype of each send buffer element
where dest is the rank of the destination process (ranks are the values obtained with MPI_Comm_rank)
where tag is the message tag (integer), used to distinguish the messages sent
where comm is the communicator; in these examples we use MPI_COMM_WORLD
This command lets us send data from one process to another.


 MPI_Recv(void *buf, int count, MPI_Datatype datatype, int source, int tag, MPI_Comm comm, MPI_Status *status)
where buf is the initial address of the receive buffer (output)
where count is the maximum number of elements in the receive buffer (nonnegative integer)
where datatype is the datatype of each receive buffer element
where source is the rank of the source process (ranks are the values obtained with MPI_Comm_rank)
where tag is the message tag, used to distinguish the messages received
where status is a status object (output)



MPI_Send and MPI_Recv are blocking commands: each call does not return until it is safe to reuse (or read) the buffer.
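To make the send/receive pair concrete, here is a minimal sketch (my own example, not from the video tutorials) in which rank 0 sends one integer to rank 1. It assumes the program is launched with at least two processes.

```c
#include <stdio.h>
#include <mpi.h>

int main(int argc, char *argv[])
{
    int nprocs, myproc, tag = 0;
    MPI_Init(&argc, &argv);
    MPI_Comm_size(MPI_COMM_WORLD, &nprocs);
    MPI_Comm_rank(MPI_COMM_WORLD, &myproc);

    if (myproc == 0 && nprocs > 1) {
        int value = 42;
        /* blocks until the send buffer can safely be reused */
        MPI_Send(&value, 1, MPI_INT, 1, tag, MPI_COMM_WORLD);
    } else if (myproc == 1) {
        int value;
        MPI_Status status;
        /* blocks until a matching message from rank 0 arrives */
        MPI_Recv(&value, 1, MPI_INT, 0, tag, MPI_COMM_WORLD, &status);
        printf("Rank 1 received %d from rank 0\n", value);
    }

    MPI_Finalize();
    return 0;
}
```

Because both calls are blocking, two processes that each call MPI_Send before MPI_Recv can deadlock; always pair sends and receives carefully.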

Hello MPI

This is the code I used for the hello world example:

#include <stdio.h>
#include <mpi.h>
int main(int argc, char *argv[])
{
   int nprocs, myproc;
   MPI_Init( &argc, &argv );                  /* start the MPI environment */
   MPI_Comm_size( MPI_COMM_WORLD, &nprocs );  /* total number of processes */
   MPI_Comm_rank( MPI_COMM_WORLD, &myproc );  /* this process's rank */
   printf("Hello, I am %d of %d\n\n", myproc, nprocs);
   MPI_Finalize();                            /* shut down MPI */
   return 0;
}
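To compile and run the hello world above (assuming it is saved as hello.c and the MPICH compiler wrappers are on your PATH):

```shell
mpicc hello.c -o hello   # mpicc is the MPI wrapper around the C compiler
mpirun -np 4 ./hello     # launch 4 processes; each prints its own rank
```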



How can I install MPI on my computer?

Well, I will walk through a basic installation from scratch.

We will follow these steps as far as they take us.


First, download the tarball from the official page:
http://www.mpich.org/downloads/


Now we extract it:
 tar -xzf mpich2-1.4.tar.gz
and then move into the source directory:

cd mpi....

Before configuring, we create an installation folder:
mkdir instalacion

We install Fortran first:
sudo apt-get install gfortran

Now we configure, pointing --prefix at the installation folder:

./configure --prefix "/home/fablab/carlos/instalacion"

Now a make :P
They say this can take a while; it is 4:20 now, let's see how long it takes.
Well, it didn't take long: it's 4:27 and it's done.

make
sudo make install

Now we add the bin and lib directories of the install folder (the --prefix we used above) to the environment:

export PATH="$PATH:/home/fablab/carlos/instalacion/bin"
export LD_LIBRARY_PATH="$LD_LIBRARY_PATH:/home/fablab/carlos/instalacion/lib"

Now we save these settings for future sessions:

echo 'export PATH=$PATH:/home/fablab/carlos/instalacion/bin' >> /home/$USER/.bashrc
echo 'export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/home/fablab/carlos/instalacion/lib' >> /home/$USER/.bashrc


That's all.
Not too complicated.
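A quick way to check that the installation worked (assuming the PATH lines above were applied, e.g. in a new terminal):

```shell
which mpicc    # should print a path inside the install folder
which mpirun
mpicc -show    # prints the underlying compile command MPICH will use
```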

miércoles, 18 de junio de 2014

Welcome





Welcome to this blog about parallel computing. Here I will talk about parallel computers and parallel algorithms, but I will usually write in Spanish.

I will write in Spanish because I could not find books about this course in that language.

In these tutorials I will post videos about parallel algorithms, MPI, OpenMP, and Monte Carlo, among other topics.