Python Semaphores

The sample program, however, starts up five threads. This therefore means that the first two running threads will acquire the semaphore and the remaining three will have to wait to acquire the semaphore. Once the first two release the semaphore, a further two can acquire it, and so on.

from threading import Thread, Semaphore, currentThread
from time import sleep

def worker(semaphore):
    with semaphore:
        print(currentThread().getName(), 'entered')
        sleep(0.5)
        print(currentThread().getName(), 'exiting')

print('MainThread starting')
semaphore = Semaphore(2)
for i in range(0, 5):
    thread = Thread(name='T' + str(i), target=worker, args=[semaphore])
    thread.start()
print('MainThread done')

The output from a run of this program is given below:

MainThread starting
T0 entered
T1 entered
MainThread done
T0 exiting
T2 entered
T1 exiting
T3 entered
T2 exiting
T4 entered
T3 exiting
T4 exiting

The Concurrent Queue Class

As might be expected, the model where a producer thread or process generates data to be processed by one or more consumer threads or processes is so common that a higher level abstraction is provided in Python than the use of Locks, Conditions or Semaphores. This is the blocking queue model implemented by the threading Queue or multiprocessing Queue classes.
Both these queue classes are thread and process safe; that is, they work appropriately (using internal locks) to manage data access from concurrent threads or processes.

An example of using a Queue to exchange data between a worker process and the main process is shown below. The worker process executes the worker() function, sleeping for two seconds before putting the string 'hello world' on the queue. The main application function sets up the queue and creates the process. The queue is passed into the process as one of its arguments. The process is then started. The main process then waits until data is available on the queue via the (blocking) get() method. Once the data is available it is retrieved and printed out before the main process terminates.

from multiprocessing import Process, Queue
from time import sleep

def worker(queue):
    print('Worker going to sleep')
    sleep(2)
    print('Worker woken up and putting data on queue')
    queue.put('hello world')

def main():
    print('Main starting')
    queue = Queue()
    p = Process(target=worker, args=[queue])
    print('Main starting the process')
    p.start()
    print('Main waiting for data')
    print(queue.get())
    print('Main done')

if __name__ == '__main__':
    main()

The output from this is shown below:

Main starting
Main starting the process
Main waiting for data
Worker going to sleep
Worker woken up and putting data on queue
hello world
Main done

However, this does not make it that clear how the execution of the two processes interweaves; the following diagram illustrates this graphically.
In the above diagram the main process waits for a result to be returned from the queue following the call to the get() method; as it is waiting, it is not using any system resources. In turn, the worker process sleeps for two seconds before putting some data onto the queue (via put('hello world')). After this value is sent to the queue, the value is returned to the main process, which is woken up (moved out of the waiting state) and can continue to process the rest of the main function.

Online Resources

See the following online resources for information discussed in this chapter:
- Thread based barriers, locks, conditions, semaphores and events
- Process based barriers, locks, conditions, semaphores and events

Exercises

The aim of this exercise is to implement a concurrent version of a stack based container/collection. It should be possible to safely add data to the stack and pop data off the stack using multiple threads.
It should follow a similar pattern to the Queue class described above, but support the first in last out (FILO) behaviour of a stack, and be usable with any number of producer and consumer threads (you can ignore processes for this exercise).

The key to implementing the stack is to remember that no data can be read from the stack until there is some data to access; it is therefore necessary to wait for data to become available and then to read it. However, it is a producer thread that will provide that data and then inform any waiting threads that there is now data available. You can implement this in any way you wish; however, a common solution is to use a Condition.

To illustrate this idea, the following test program can be used to verify the behaviour of your stack:

from stack import Stack  # the Stack class you implement for this exercise
from time import sleep
from threading import Thread

def producer(stack):
    for i in range(0, 6):
        data = 'Task' + str(i)
        print('Producer pushing:', data)
        stack.push(data)
        sleep(1)

def consumer(label, stack):
    while True:
        print(label, 'stack.pop():', stack.pop())

print('Create shared stack')
stack = Stack()
print('Stack:', stack)
print('Creating and starting consumer threads')
consumer1 = Thread(target=consumer, args=('Consumer1', stack))
consumer2 = Thread(target=consumer, args=('Consumer2', stack))
consumer3 = Thread(target=consumer, args=('Consumer3', stack))
consumer1.start()
consumer2.start()
consumer3.start()
print('Creating and starting producer thread')
producer = Thread(target=producer, args=[stack])
producer.start()

The output generated from this sample program (which includes print statements from the Stack) is given below:
Create shared stack
Stack: Stack[]
Creating and starting consumer threads
Creating and starting producer thread
Producer pushing: Task0
Consumer1 stack.pop(): Task0
Producer pushing: Task1
Consumer2 stack.pop(): Task1
Producer pushing: Task2
Consumer3 stack.pop(): Task2
Producer pushing: Task3
Consumer1 stack.pop(): Task3
Producer pushing: Task4
Consumer2 stack.pop(): Task4
Producer pushing: Task5
Consumer3 stack.pop(): Task5
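The following is one possible sketch of such a Stack class using a threading Condition, as suggested above. It is illustrative only and not the book's own solution; the class name, the internal list and the method names are simply assumed here.

from threading import Condition

class Stack:
    """A minimal thread-safe stack sketch using a Condition (illustrative only)."""

    def __init__(self):
        self._data = []                 # internal storage for the stack items
        self._condition = Condition()   # coordinates producer and consumer threads

    def push(self, item):
        # acquire the condition's lock, add the item and wake one waiting consumer
        with self._condition:
            self._data.append(item)
            self._condition.notify()

    def pop(self):
        # wait until at least one item is available, then remove the newest item
        with self._condition:
            while len(self._data) == 0:
                self._condition.wait()
            return self._data.pop()

    def __str__(self):
        return 'Stack' + str(self._data)

Note that wait() is called inside a while loop; this guards against spurious wake-ups and against another consumer removing the item before this thread resumes.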
Futures

Introduction

A Future is a thread (or process) that promises to return a value in the future, once the associated behaviour has completed. It is thus a future value. It provides a very simple way of firing off behaviour that will either be time consuming to execute or which may be delayed due to expensive operations, such as Input/Output, and which could slow down the execution of other elements of a program. This chapter discusses futures in Python.

The Need for a Future

In a normal method or function invocation, the method or function is executed in line with the invoking code (the caller) having to wait until the function or method (the callee) returns. Only after this is the caller able to continue to the next line of code and execute that. In many (most) situations this is exactly what you want, as the next line of code may depend on a result returned from the previous line of code, etc.

However, in some situations the next line of code is independent of the previous line of code. For example, let us assume that we are populating a User Interface (UI). The first line of code may read the name of the user from some external data source (such as a database) and then display it within a field in the UI. The next line of code may then add today's date to another field in the UI. These two lines of code are independent of each other and could be run concurrently/in parallel with each other.

In this situation we could use either a Thread or a Process to run the two lines of code independently of the caller, thus achieving a level of concurrency and allowing the caller to carry on to the third line of code, etc.
However, neither the Thread nor the Process by default provides a simple mechanism for obtaining a result from such an independent operation. This may not be a problem, as operations may be self-contained; for example, they may obtain data from the database or from today's date and then update a UI. However, in many situations the calculation will return a result which needs to be handled by the original invoking code (the caller). This could involve performing a long running calculation and then using the result returned to generate another value or update another object, etc.

A Future is an abstraction that simplifies the definition and execution of such concurrent tasks. Futures are available in many different languages including Python, but also Java, Scala, C++ etc.

When using a Future, a callable object (such as a function) is passed to the Future, which executes the behaviour either as a separate thread or as a separate process and can then return a result once it is generated. The result can either be handled by a callback function (that is invoked when the result is available) or by using an operation that will wait for the result to be provided.

Futures in Python

The concurrent.futures library was introduced into Python in version 3.2 (and is also available for Python 2 via a backport). The concurrent.futures library provides the Future class and a high level API for working with futures.

The concurrent.futures.Future class encapsulates the asynchronous execution of a callable object (e.g. a function or method). The Future class provides a range of methods that can be used to obtain information about the state of the future, retrieve results or cancel the future:

- cancel() Attempt to cancel the future. If the future is currently being executed and cannot be cancelled then the method will return False, otherwise the call will be cancelled and the method will return True.
- cancelled() Returns True if the future was successfully cancelled.
- running() Returns True if the future is currently being executed and cannot be cancelled.
- done() Returns True if the future was successfully cancelled or finished running.
- result(timeout=None) Return the value returned by the future. If the future hasn't yet completed then this method will wait up to timeout seconds. If the call hasn't completed in timeout seconds, then a TimeoutError will be raised. timeout can be an int or float. If timeout is not specified or None, there is no limit to the wait time. If the future is cancelled before completing then a CancelledError will be raised. If the call raised an exception, this method will raise the same exception.
It should be noted, however, that Future instances should not be created directly; rather they should be created via the submit() method of an appropriate executor.

Future Creation

Futures are created and executed by executors. An executor provides two methods that can be used to execute a future (or futures) and one to shut down the executor. At the root of the executor class hierarchy is the concurrent.futures.Executor abstract class. It has two subclasses: the ThreadPoolExecutor and the ProcessPoolExecutor. The ThreadPoolExecutor uses threads to execute the futures, while the ProcessPoolExecutor uses separate processes. You can therefore choose how you want the future to be executed by specifying one or other of these executors.

A Simple Example Future

To illustrate these ideas, we will look at a very simple example of using a future. To do this we will use a simple worker function, similar to that used in the previous chapter:

from time import sleep

# define a function to be used with a future
def worker(msg):
    for i in range(0, 10):
        print(msg, end='', flush=True)
        sleep(1)
    return i

The only difference with this version of worker is that it also returns a result, which is the number of times that the worker printed out the message. We can of course invoke this method inline as follows:

res = worker('A')
print(res)
We can make the invocation of this method into a future. To do this we use a ThreadPoolExecutor imported from the concurrent.futures module. We will then submit the worker function to the pool for execution. This returns a reference to a future which we can use to obtain the result:

from time import sleep
from concurrent.futures import ThreadPoolExecutor

print('Setting up the ThreadPoolExecutor')
pool = ThreadPoolExecutor(1)
# submit the function to the pool to run concurrently; obtain a future from the pool
print('Submitting the worker to the pool')
future = pool.submit(worker, 'A')
print('Obtained a reference to the future object', future)
# obtain the result from the future - wait if necessary
print('Future result():', future.result())
print('Done')

The output from this is:

Setting up the ThreadPoolExecutor
Submitting the worker to the pool
AAObtained a reference to the future object <Future at 0x... state=running>
AAAAAAAAFuture result(): 9
Done

Notice how the output from the main program and the worker is interwoven, with two 'A's being printed out before the message starting 'Obtained...'. In this case a new ThreadPoolExecutor is being created with one thread in the pool (typically there would be multiple threads in the pool, but one is being used here for illustrative purposes). The submit() method is then used to submit the function worker, with the parameter 'A', to the ThreadPoolExecutor for it to schedule execution of the function. The submit() method returns a Future object. The main program then waits for the future object to return a result (by calling the result() method on the future); this method can also take a timeout.

To change this example to use processes rather than threads, all that is needed is to change the pool executor to a ProcessPoolExecutor:
from concurrent.futures import ProcessPoolExecutor

print('Setting up the ThreadPoolExecutor')
pool = ProcessPoolExecutor(1)
print('Submitting the worker to the pool')
future = pool.submit(worker, 'A')
print('Obtained a reference to the future object', future)
print('Future result():', future.result())
print('Done')

The output from this program is very similar to the last one:

Setting up the ThreadPoolExecutor
Submitting the worker to the pool
Obtained a reference to the future object <Future at 0x... state=running>
AAAAAAAAAAFuture result(): 9
Done

The only difference is that in this particular run the message starting 'Obtained...' is printed out before any of the 'A's are printed; this may be due to the fact that a process initially takes longer to set up than a thread.

Running Multiple Futures

Both the ThreadPoolExecutor and the ProcessPoolExecutor can be configured to support multiple threads/processes via the pool. Each task that is submitted to the pool will then run within a separate thread/process. If more tasks are submitted than there are threads/processes available, then the submitted task will wait for the first available thread/process and then be executed. This can act as a way of managing the amount of concurrent work being done.

For example, in the following example the worker() function is submitted to the pool four times, but the pool is configured to use 3 threads. Thus the fourth worker will need to wait until one of the first three completes before it is able to execute:

from concurrent.futures import ThreadPoolExecutor

print('Starting')
pool = ThreadPoolExecutor(3)
future1 = pool.submit(worker, 'A')
future2 = pool.submit(worker, 'B')
future3 = pool.submit(worker, 'C')
future4 = pool.submit(worker, 'D')
print('\nfuture4.result():', future4.result())
print('All done')
When this runs we can see that the futures for A, B and C all run concurrently, but D must wait until one of the others finishes:

Starting
ABCACBCABCBABCACBACABCBACABCBADDDDDDDDDD
future4.result(): 9
All done

The main thread also waits for future4 to finish, as it requests the result, which is a blocking call that will only return once the future has completed and generated a result.

Again, to use processes rather than threads, all we need to do is to replace the ThreadPoolExecutor with the ProcessPoolExecutor:

from concurrent.futures import ProcessPoolExecutor

print('Starting')
pool = ProcessPoolExecutor(3)
future1 = pool.submit(worker, 'A')
future2 = pool.submit(worker, 'B')
future3 = pool.submit(worker, 'C')
future4 = pool.submit(worker, 'D')
print('\nfuture4.result():', future4.result())
print('All done')

Waiting for All Futures to Complete

It is possible to wait for all futures to complete before progressing. In the previous section it was assumed that future4 would be the last future to complete, but in many cases it may not be possible to know which future will be the last to complete. In such situations it is very useful to be able to wait for all the futures to complete before continuing. This can be done using the concurrent.futures.wait function. This function takes a collection of futures and, optionally, a timeout and a return_when indicator:

wait(fs, timeout=None, return_when=ALL_COMPLETED)

where:
- timeout can be used to control the maximum number of seconds to wait before returning. timeout can be an int or float. If timeout is not specified or None, there is no limit to the wait time.
- return_when indicates when this function should return. It must be one of the following constants:
  - FIRST_COMPLETED The function will return when any future finishes or is cancelled.
19,411 | running multiple futures first_exception the function will return when any future finishes by raising an exception if no future raises an exceptionthen it is equivalent to all_completed all_completed the function will return when all futures finish or are cancelled the wait(function returns two sets done and not_done the first set contains the futures that completed (finished or were cancelledbefore the wait completed the second setthe not_donescontains uncompleted futures we can use the wait(function to modify out previous example so that we no longer rely on future finishing lastfrom concurrent futures import processpoolexecutor from concurrent futures import wait from time import sleep def worker(msg)for in range( , )print(msg,end=''flush=truesleep( return print('starting setting up pool'pool processpoolexecutor( futures [print('submitting futures'future pool submit(worker' 'futures append(future future pool submit(worker' 'futures append(future future pool submit(worker' 'futures append(future future pool submit(worker' 'futures append(future print('waiting for futures to complete'wait(futuresprint('\nall done'the output from this isstarting setting up pool submitting futures waiting for futures to complete abcabcabcabcabcabcbcacbacbabcadddddddddd all done note how each future is added to the list of futures which is then passed to the wait(function |
Processing Results as Completed

What if we want to process each of the results returned by our collection of futures? We could loop through the futures list in the previous section once all the results have been generated. However, this means that we would have to wait for them all to complete before processing the list. In many situations we would like to process the results as soon as they are generated, without being concerned whether that is the first, third, last or second etc.

The concurrent.futures.as_completed() function does precisely this: it will serve up each future in turn as soon as it has completed, with all futures eventually being returned, but without guaranteeing the order (just that as soon as a future is finished generating a result it will be immediately available).

For example, in the following example the is_even() function sleeps for a random number of seconds (ensuring that different invocations of this function will take different durations) and then calculates a result:

from concurrent.futures import ThreadPoolExecutor, as_completed
from time import sleep
from random import randint

def is_even(n):
    print('Checking if', n, 'is even')
    sleep(randint(1, 5))
    return str(n) + ' ' + str(n % 2 == 0)

print('Started')
data = [1, 2, 3, 4, 5, 6]
pool = ThreadPoolExecutor(6)
futures = []
for v in data:
    futures.append(pool.submit(is_even, v))
for f in as_completed(futures):
    print(f.result())
print('Done')

The second for loop will loop through each future as they complete, printing out the result from each, as shown below:
Started
Checking if 1 is even
Checking if 2 is even
Checking if 3 is even
Checking if 4 is even
Checking if 5 is even
Checking if 6 is even
3 False
2 True
5 False
1 False
6 True
4 True
Done

As you can see from this output, although the six futures were started in sequence, the results returned are in a different order (with the returned order here being 3, 2, 5, 1, 6 and finally 4).

Processing Future Results Using a Callback

An alternative to the as_completed() approach is to provide a function that will be called once a result has been generated. This has the advantage that the main program is never paused; it can continue doing whatever is required of it. The function called once the result is generated is typically known as a callback function; that is, the future calls back to this function when the result is available. Each future can have a separate callback, as the function to invoke is set on the future using the add_done_callback() method. This method takes the name of the function to invoke.

For example, in this modified version of the previous example, we specify a callback function that will be used to print the future's result. This callback function is called print_future_result(); it takes the future that has completed as its argument.
from concurrent.futures import ThreadPoolExecutor
from time import sleep
from random import randint

def is_even(n):
    print('Checking if', n, 'is even')
    sleep(randint(1, 5))
    return str(n) + ' ' + str(n % 2 == 0)

def print_future_result(future):
    print('In callback future result', future.result())

print('Started')
data = [1, 2, 3, 4, 5, 6]
pool = ThreadPoolExecutor(5)
for v in data:
    future = pool.submit(is_even, v)
    future.add_done_callback(print_future_result)
print('Done')

When we run this, we can see that the callback function is called after the main thread has completed. Again, the order is unspecified, as the is_even() function still sleeps for a random amount of time:

Started
Checking if 1 is even
Checking if 2 is even
Checking if 3 is even
Checking if 4 is even
Checking if 5 is even
Done
In callback future result 1 False
Checking if 6 is even
In callback future result 3 False
In callback future result 2 True
In callback future result 5 False
In callback future result 4 True
In callback future result 6 True
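It is also worth noting (this is standard concurrent.futures usage rather than an example from the text) that an executor can be used as a context manager, which shuts the pool down automatically when the block exits, and that Executor.map() offers a shorthand when the same function is applied to a collection of values:

from concurrent.futures import ThreadPoolExecutor

def double(n):
    # trivial function used to illustrate Executor.map()
    return n * 2

# the with statement calls pool.shutdown() automatically on exit
with ThreadPoolExecutor(4) as pool:
    # map() submits one task per value and yields the results in input order
    for value in pool.map(double, [1, 2, 3, 4, 5]):
        print(value)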
Online Resources

See the following online resources for information on futures:
- The Python library documentation on futures
- Tutorials on futures

Exercises

In mathematics, the factorial of a positive integer n, denoted by n!, is the product of all positive integers less than or equal to n. For example, 5! = 5 x 4 x 3 x 2 x 1 = 120. Note that the value of 0! is 1.

Write a future that will calculate the factorial of any given number, with the result being printed out via a callback function. There are several ways in which the factorial value can be calculated, either using a for loop or a recursive function. In either case, sleep for a millisecond between each calculation step. Start multiple futures for different factorial values and see which comes back first.
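A possible starting point for this exercise is sketched below. It is one solution among many; the function names, the pool size and the chosen input values are our own and not the book's:

from concurrent.futures import ThreadPoolExecutor
from time import sleep

def factorial(n):
    # iterative factorial, sleeping for a millisecond between each step
    result = 1
    for i in range(1, n + 1):
        result = result * i
        sleep(0.001)
    return result

def print_result(future):
    # callback invoked when a factorial future completes
    print('Factorial result:', future.result())

pool = ThreadPoolExecutor(4)
for n in (5, 7, 3, 9):
    future = pool.submit(factorial, n)
    future.add_done_callback(print_result)
print('All futures submitted')
pool.shutdown()  # waits for the submitted futures to finish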
Concurrency with AsyncIO

Introduction

The async IO facilities in Python are relatively recent additions, originally introduced in Python 3.4 and evolving up to and including Python 3.7. They comprise (as of Python 3.7) two keywords, async and await (introduced in Python 3.5), and the asyncio Python package.

In this chapter we first discuss asynchronous IO before introducing the async and await keywords. We then present async IO tasks: how they are created, used and managed.

Asynchronous IO

Asynchronous IO (or async IO) is a language agnostic concurrent programming model (or paradigm) that has been implemented in several different programming languages (such as C# and Scala) as well as in Python.

Asynchronous IO is another way in which you can build concurrent applications in Python. It is in many ways an alternative to the facilities provided by the threading library in Python. However, whereas the threading library is more susceptible to issues associated with the GIL (the Global Interpreter Lock), which can affect performance, the async IO facilities are better insulated from this issue. The way in which async IO operates is also lighter weight than the facilities provided by the multiprocessing library, since the asynchronous tasks in async IO run within a single process rather than requiring separate processes to be spawned on the underlying hardware.

Async IO is therefore another alternative way of implementing concurrent solutions to problems. It should be noted that it does not build on either threading or multiprocessing; instead async IO is based on the idea of cooperative multitasking.
These cooperating tasks operate asynchronously; by this we mean that the tasks:
- are able to operate separately from other tasks,
- are able to wait for another task to return a result when required,
- and are thus able to allow other tasks to run while they are waiting.

The IO (Input/Output) aspect of the name async IO is because this form of concurrent program is best suited to I/O bound tasks. In an I/O bound task a program spends most of its time sending data to, or reading data from, some form of external device (for example a database or a set of files etc.). This communication is time consuming and means that the program spends most of its time waiting for a response from the external device. One way in which such I/O bound applications can (appear to) speed up is to overlap the execution of different tasks; thus, while one task is waiting for a database to respond with some data, another task can be writing data to a log file etc.

Async IO Event Loop

When you are developing code using the async IO facilities you do not need to worry about how the internals of the async IO library work; however, at least at the conceptual level, it is useful to understand one key concept: that of the async IO event loop. This loop controls how and when each task gets run. For the purposes of this discussion a task represents some work that can be run independently of other pieces of work.

The event loop knows about each task to be run and what the state of the task currently is (for example whether it is waiting for something to happen/complete). It selects a task that is ready to run from the list of available tasks and executes it. This task has complete control of the CPU until it either completes its work or hands back control to the event loop (for example, because it must now wait for some data to be supplied from a database). The event loop now checks to see if any of the waiting tasks are ready to continue executing and makes a note of their status. The event loop then selects another task that is ready to run and starts that task off. This loop continues until all the tasks have finished. This is illustrated below.
An important point to note in the above description is that a task does not give up the processor unless it decides to, for example by having to wait for something else. Tasks never get interrupted in the middle of an operation; this avoids the problem that two threads might have when being time sliced by a separate scheduler, as they may both be sharing the same resource. This can greatly simplify your code.

The async and await Keywords

The async keyword, introduced in Python 3.5, is used to mark a function as being something that uses the await keyword (we will come back to this below, as there is one other use of the async keyword).

A function that uses the await keyword can be run as a separate task and can give up control of the processor when it calls await against another async function and must wait for that function to complete. The invoked async function can then run as a separate task etc.

To invoke an async function it is necessary to start the async IO event loop and for that function to be treated as a task by the event loop. This is done by calling the asyncio.run() method and passing in the root async function. The asyncio.run() function was introduced in Python 3.7 (older versions of Python, such as Python 3.6, required you to explicitly obtain a reference to the event loop and to run the root async function via that). One point to note about this function is that it has been marked as being provisional in Python 3.7. This means that future versions of Python may or may not support the function, or may modify the function in some way. You should therefore check the documentation for the version of Python you are using to see whether the run method has been altered or not.

Using async and await

We will examine a very simple async IO program from the top down. The main() function for the program is given below:

def main():
    print('Main starting')
    asyncio.run(do_something())
    print('Main done')

if __name__ == '__main__':
    main()
The main() function is the entry point for the program and calls:

asyncio.run(do_something())

This starts the async IO event loop running and results in the do_something() function being wrapped up in a task that is managed by the loop. Note that you do not explicitly create a task in async IO; they are always created by some function. However, it is useful to be aware of tasks, as you can interact with them to check their status or to retrieve a result.

The do_something() function is marked with the keyword async:

async def do_something():
    print('do_something will wait for worker')
    result = await worker()
    print('do_something result:', result)

As previously mentioned, this indicates that it can be run as a separate task and that it can use the keyword await to wait for some other function or behaviour to complete. In this case the do_something() asynchronous function must wait for the worker() function to complete.

The await keyword does more than merely indicate that the do_something() function must wait for the worker to complete. It triggers another task to be created that will execute the worker() function and releases the processor, allowing the event loop to select the next task to execute (which may or may not be the task running the worker() function). The status of the do_something task is now waiting, while the status of the worker() task is ready (to run).

The code for the worker task is given below:

async def worker():
    print('worker will take some time')
    time.sleep(1)
    print('worker done it')
    return 42

The async keyword again indicates that this function can be run as a separate task. However, this time the body of the function does not use the await keyword. This is because it is a special case known as an async IO coroutine function. This is a function that returns a value from a task (it is related to the idea of a standard Python coroutine, which is a data consumer; sadly, computer science has many examples where the same term has been used for different things, as well as examples where different terms have been used for the same thing). In this case, to avoid confusion, just remember that async IO coroutines are functions marked with async that can be run as a separate task and may call await.
The full listing for the program is given below:

import asyncio
import time

async def worker():
    print('worker will take some time')
    time.sleep(1)
    print('worker done it')
    return 42

async def do_something():
    print('do_something will wait for worker')
    result = await worker()
    print('do_something result:', result)

def main():
    print('Main starting')
    asyncio.run(do_something())
    print('Main done')

if __name__ == '__main__':
    main()

When this program is executed the output is:

Main starting
do_something will wait for worker
worker will take some time
worker done it
do_something result: 42
Main done

When this is run there is a pause between the two worker printouts as it sleeps. Although it is not completely obvious here, the do_something() function was run as one task; this task then waited when it got to the worker() function, which was run as another task. Once the worker task completed, the do_something task could continue and complete its operation. Once this happened the async IO event loop could then terminate as no further tasks were available.

Async IO Tasks

Tasks are used to execute functions marked with the async keyword concurrently. Tasks are never created directly; instead they are created implicitly via the keyword await, or through functions such as asyncio.run() described above, or
asyncio.create_task(), asyncio.gather() and asyncio.as_completed(). These additional task creation functions are described below:

- asyncio.create_task() This function takes a function marked with async, wraps it inside a task and schedules it for execution by the async IO event loop. This function was added in Python 3.7.
- asyncio.gather(*aws) This function runs all the async functions passed to it as separate tasks. It gathers the results of each separate task together and returns them as a list. The order of the results corresponds to the order of the async functions in the aws list.
- asyncio.as_completed(aws) Runs each of the async functions passed to it.

A Task object supports several useful methods:
- cancel() Cancels a running task. Calling this method will cause the task to throw a CancelledError exception.
- cancelled() Returns True if the task has been cancelled.
- done() Returns True if the task has completed, raised an exception or was cancelled.
- result() Returns the result of the task if it is done. If the task's result is not yet available, then the method raises the InvalidStateError exception.
- exception() Returns an exception if one was raised by the task. If the task was cancelled then it raises the CancelledError exception. If the task is not yet done, then it raises an InvalidStateError exception.

It is also possible to add a callback function to invoke once the task has completed (or to remove such a function if it has been added):
- add_done_callback(callback) Add a callback to be run when the task is done.
- remove_done_callback(callback) Remove a callback from the callbacks list.

Note that the method is called 'add' rather than 'set', implying that there can be multiple functions called when the task has completed (if required).

The following example illustrates some of the above:

import asyncio

async def worker():
    print('worker will take some time')
    await asyncio.sleep(1)
    print('worker done it')
    return 42

def print_it(task):
    print('print_it result:', task.result())
async def do_something():
    print('do_something create task for worker')
    task = asyncio.create_task(worker())
    print('do_something add callback')
    task.add_done_callback(print_it)
    await task
    # information on the task
    print('do_something task.cancelled():', task.cancelled())
    print('do_something task.done():', task.done())
    print('do_something task.result():', task.result())
    print('do_something task.exception():', task.exception())
    print('do_something finished')

def main():
    print('Main starting')
    asyncio.run(do_something())
    print('Main done')

if __name__ == '__main__':
    main()

In this example, the worker() function is wrapped within a Task object that is returned from the asyncio.create_task(worker()) call. A function (print_it()) is registered as a callback on the task using the add_done_callback() method. Note that the callback is passed the task that has completed as a parameter. This allows it to obtain information from the task, such as any result generated.

In this example the async function do_something() explicitly waits on the task to complete. Once this happens, several different methods are used to obtain information about the task (such as whether it was cancelled or not).

One other point to note about this listing is that in the worker() function we have added an await using the asyncio.sleep(1) function; this allows the worker to sleep without blocking the event loop. It is the async IO alternative to time.sleep(1).

The output from this program is:

Main starting
do_something create task for worker
do_something add callback
worker will take some time
worker done it
print_it result: 42
do_something task.cancelled(): False
do_something task.done(): True
do_something task.result(): 42
do_something task.exception(): None
do_something finished
Main done
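To illustrate the cancel() method and the CancelledError mentioned in the list above, the following sketch (not from the text; the coroutine names and sleep durations are assumed) starts a long running task and cancels it before it completes:

import asyncio

async def long_worker():
    print('long_worker starting')
    await asyncio.sleep(10)   # simulate a long running operation
    return 'finished'

async def do_something():
    task = asyncio.create_task(long_worker())
    await asyncio.sleep(0.1)  # give the task a chance to start
    task.cancel()             # request cancellation of the task
    try:
        await task
    except asyncio.CancelledError:
        print('task was cancelled')
    print('task.cancelled():', task.cancelled())

asyncio.run(do_something())

Awaiting a cancelled task raises CancelledError in the awaiting coroutine, and afterwards task.cancelled() returns True.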
Running Multiple Tasks

In many cases it is useful to be able to run several tasks concurrently. There are two options provided for this: the asyncio.gather() and the asyncio.as_completed() functions. We will look at both in this section.

Collating Results from Multiple Tasks

It is often useful to collect all the results from a set of tasks together and to continue only once all the results have been obtained. When using threads or processes this can be achieved by starting multiple threads or processes and then using some other object, such as a Barrier, to wait for all the results to be available before continuing. Within the async IO library all that is required is to use the asyncio.gather() function with a list of the async functions to run, for example:

import asyncio
import random

async def worker():
    print('worker will take some time')
    await asyncio.sleep(1)
    result = random.randint(1, 10)
    print('worker done it')
    return result

async def do_something():
    print('do_something will wait for worker')
    # run three calls to worker concurrently and collect the results
    results = await asyncio.gather(worker(), worker(), worker())
    print('results from calls:', results)

def main():
    print('Main starting')
    asyncio.run(do_something())
    print('Main done')

if __name__ == '__main__':
    main()

In this program the do_something() function uses

results = await asyncio.gather(worker(), worker(), worker())

to run three invocations of the worker() function in three separate tasks and to wait for the results of all three to be made available, before they are returned as a list of values and stored in the results variable.
This makes it very easy to work with multiple concurrent tasks and to collate their results. Note that in this code example the worker async function returns a random number between 1 and 10. The output from this program is:

Main starting
do_something will wait for worker
worker will take some time
worker will take some time
worker will take some time
worker done it
worker done it
worker done it
results from calls: [7, 2, 9]
Main done

As you can see from this, all three of the worker invocations are started, but then release the processor while they sleep. After this the three tasks wake up and complete, before the results are collected together and printed out.

Handling Task Results as They Are Made Available

Another option when running multiple tasks is to handle the results as they become available, rather than waiting for all the results to be provided before continuing. This option is supported by the asyncio.as_completed() function. This function returns an iterator of async functions which will be served up as soon as they have completed their work. The for-loop construct can be used with the iterator returned by the function; however, within the for loop the code must call await on the async functions returned so that the result of the task can be obtained. For example:

async def do_something():
    print('do_something will wait for worker')
    # run three calls to worker concurrently and collect the results
    for async_func in asyncio.as_completed((worker('A'), worker('B'), worker('C'))):
        result = await async_func
        print('do_something result:', result)

Note that the asyncio.as_completed() function takes a container, such as a tuple, of async functions.
We have also modified the worker function slightly so that a label is added to the random number generated, so that it is clear which invocation of the worker function returned which result:

async def worker(label):
    print('worker will take some time')
    await asyncio.sleep(1)
    result = random.randint(1, 10)
    print('worker done it')
    return label + str(result)

When we run this program:

def main():
    print('Main starting')
    asyncio.run(do_something())
    print('Main done')

the output is:

Main starting
do_something will wait for worker
worker will take some time
worker will take some time
worker will take some time
worker done it
worker done it
worker done it
do_something result: C3
do_something result: A9
do_something result: B5
Main done

As you can see from this, the results are not returned in the order that the tasks were created: task 'C' completes first, followed by 'A' and 'B'. This illustrates the behaviour of the asyncio.as_completed() function.

Online Resources

See the following online resources for information on asyncio:
- The Python documentation on asyncio
- An asyncio tutorial
Exercises

This exercise will use the facilities in the asyncio library to calculate a set of factorial numbers. The factorial of a positive integer n is the product of all positive integers less than or equal to n. For example, 5! = 5 x 4 x 3 x 2 x 1 = 120. Note that the value of 0! is 1.

Create an application that will use the async and await keywords to calculate the factorials of a set of numbers. The factorial function should await for a fraction of a second (using asyncio.sleep()) each time round the loop used to calculate the factorial of a number.

You can use either asyncio.as_completed() or asyncio.gather() to collect the results up. You might also use a list comprehension to create the list of calls to the factorial function. The main function might look like:

def main():
    print('Main starting')
    asyncio.run(calculate_factorials([3, 5, 7, 9]))
    print('Main done')

if __name__ == '__main__':
    main()
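One possible shape for the calculate_factorials() coroutine used above is sketched below. It is an illustrative solution only; the coroutine names, the sleep duration and the result formatting are our own choices:

import asyncio

async def factorial(n):
    # calculates n!, awaiting briefly on each iteration so other tasks can run
    result = 1
    for i in range(1, n + 1):
        result = result * i
        await asyncio.sleep(0.01)
    return str(n) + '! = ' + str(result)

async def calculate_factorials(numbers):
    # create one coroutine call per number and gather all the results together
    results = await asyncio.gather(*[factorial(n) for n in numbers])
    print('Results:', results)

def main():
    print('Main starting')
    asyncio.run(calculate_factorials([3, 5, 7, 9]))
    print('Main done')

if __name__ == '__main__':
    main()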
Reactive Programming
Reactive Programming Introduction

Introduction

In this chapter we will introduce the concept of reactive programming. Reactive programming is a way of writing programs that allow the system to react to data being published to it. We will look at the RxPy library, which provides a Python implementation of the ReactiveX approach to reactive programming.

What is a Reactive Application?

A reactive application is one that must react to data; typically either to the presence of new data, or to changes in existing data. The Reactive Manifesto presents the key characteristics of reactive systems as:

- Responsive. This means that such systems respond in a timely manner. Here, of course, timely will differ depending upon the application and domain; in one situation a second may be timely, in another it may be far too slow.
- Resilient. Such systems stay responsive in the face of failure. The systems must therefore be designed to handle failure gracefully and continue to work appropriately following the failure.
- Elastic. As the workload grows the system should continue to be responsive.
- Message Driven. Information is exchanged between elements of a reactive system using messages. This ensures loose coupling, isolation and location transparency between these components.

As an example, consider an application that lists a set of equity stock trade values based on the latest market stock price data. This application might present the current value of each trade within a table. When new market stock price data is
published, then the application must update the value of the trade within the table. Such an application can be described as being reactive.

Reactive programming is a programming style (typically supported by libraries) that allows code to be written that follows the ideas of reactive systems. Of course, just because part of an application uses a reactive programming library does not make the whole application reactive; indeed it may only be necessary for part of an application to exhibit reactive behaviour.

The ReactiveX Project

ReactiveX is the best known implementation of the reactive programming paradigm. ReactiveX is based on the Observer-Observable design pattern. However, it is an extension to this design pattern, as it extends the pattern such that the approach supports sequences of data and/or events, and adds operators that allow developers to compose sequences together declaratively while abstracting away concerns associated with low-level threads, synchronisation, concurrent data structures and non-blocking I/O.

The ReactiveX project has implementations for many languages, including RxJava, RxScala and RxPy. This last is the version we are looking at, as it is for the Python language. RxPy is described as:

A library for composing asynchronous and event-based programs using observable collections and query operator functions in Python.

The Observer Pattern

The Observer pattern is one of the Gang of Four set of design patterns. The Gang of Four patterns (as originally described in Gamma et al. 1995) are so called because this book on design patterns was written by four very famous authors, namely Erich Gamma, Richard Helm, Ralph Johnson and John Vlissides.

The Observer pattern provides a way of ensuring that a set of objects is notified whenever the state of another object changes. It has been widely used in a number of languages (such as Smalltalk and Java) and can also be used with Python.

The intent of the Observer pattern is to manage a one-to-many relationship between an object and those objects interested in the state, and in particular state changes, of that object. Thus when the object's state changes, the interested (dependent) objects are notified of that change and can take whatever action is appropriate.
There are two key roles within the Observer pattern: these are the observable and the observer roles.

- Observable. This is the object that is responsible for notifying other objects that a change in its state has occurred.
- Observer. An observer is an object that will be notified of the change in state of the observable and can take appropriate action (such as triggering a change in its own state or performing some action).

In addition the state is typically represented explicitly:

- State. This role may be played by an object that is used to share information about the change in state that has occurred within the observable. This might be as simple as a string indicating the new state of the observable, or it might be a data oriented object that provides more detailed information.

These roles are illustrated in the following figure.

In the above figure, the observable object publishes data to a data stream. The data in the data stream is then sent to each of the observers registered with the observable. In this way data is broadcast to all observers of an observable.

It is common for an observable to only publish data once there is an observer available to process that data. The process of registering with an observable is referred to as subscribing. Thus an observable will have zero or more subscribers (observers).

If the observable publishes data at a faster rate than can be processed by the observer, then the data is queued via the data stream. This allows the observer to process the data received one item at a time, at its own pace, without any concern for data loss (as long as sufficient memory is available for the data stream).

Hot and Cold Observables

Another concept that it is useful to understand is that of hot and cold observables.

- Cold observables are lazy observables. That is, a cold observable will only publish data if at least one observer is subscribed to it.
- Hot observables, by contrast, publish data whether there is an observer subscribed or not.

Cold Observables

A cold observable will not publish any data unless there is at least one observer subscribed to process that data. In addition, a cold observable only provides data to an observer when that observer is ready to process the data; this is because the observable-observer relationship is more of a pull relationship. For example, given an observable that will generate a set of values based on a range, that observable will generate each result lazily, when requested by an observer. If the observer takes some time to process the data emitted by the observable, then the observable will wait until the observer is ready to process the data before emitting another value.

Hot Observables

Hot observables, by contrast, publish data whether there is an observer subscribed or not. When an observer registers with the observable, it will start to receive data at that point, as and when the observable publishes new data. If the observable has already published previous data items, then these will have been lost and the observer will not receive that data.

The most common situation in which a hot observable is created is when the source producer represents data that may be irrelevant if not processed immediately or may be superseded by subsequent data. For example, data published by a stock market price data feed would fall into this category. When an observable wraps around this data feed it can publish that data whether or not an observer is subscribed.

Implications of Hot and Cold Observables

It is important to know whether you have a hot or a cold observable, because this can impact on what you can assume about the data supplied to the observers and thus how you need to design your application. If it is important that no data is lost, then care is needed to ensure that the subscribers are in place before a hot observable starts to publish data (whereas this is not a concern for a cold observable).
Differences Between Event Driven Programming and Reactive Programming

In event driven programming, an event is generated in response to something happening; the event then represents this with any associated data. For example, if the user clicks the mouse then an associated MouseClickEvent might be generated. This object will usually hold information about the x and y coordinates of the mouse, along with which button was clicked etc. It is then possible to associate some behaviour (such as a function or method) with this event, so that if the event occurs, then the associated operation is invoked and the event object is provided as a parameter. This is certainly the approach used in the wxPython library presented earlier in this book.

From the above diagram, when a MoveEvent is generated the on_move() method is called and the event is passed into the method.

In the reactive programming approach, an observer is associated with an observable. Any data generated by the observable will be received and handled by the observer. This is true whatever that data is, as the observer is a handler of data generated by the observable, rather than a handler of a specific type of data (as with the event driven approach).

Both approaches could be used in many situations. For example, we could have a scenario in which some data is to be processed whenever a stock price changes. This could be implemented using a StockPriceChangeEvent associated with a StockPriceEventHandler. It could also be implemented via a StockPriceChangeObservable and a StockPriceChangeObserver. In either case one element handles the data generated by another element. However, the RxPy library simplifies this process and allows the observer to run in the same thread as, or a separate thread from, the observable with just a small change to the code.

Advantages of Reactive Programming

There are several advantages to the use of a reactive programming library. These include:

- It avoids multiple callback methods. The problems associated with the use of callbacks are sometimes referred to as callback hell. This can occur when there are multiple callbacks, all defined to run in response to some data being generated or some operation completing. It can be hard to understand, maintain and debug such systems.
- Simpler asynchronous, multi-threaded execution. The approach adopted by RxPy makes it very easy to execute operations/behaviour within a multi threaded environment with independent asynchronous functions.
- Available operators. The RxPy library comes pre-built with numerous operators that make processing the data produced by an observable much easier.
- Data composition. It is straightforward to compose new data streams (observables) from data supplied by two or more other observables for asynchronous processing.

Disadvantages of Reactive Programming

- It is easy to over complicate things when you start to chain operators together. If you use too many operators, or too complex a set of functions with the operators, it can become hard to understand what is going on.
- Many developers think that reactive programming is inherently multi-threaded; this is not necessarily the case. In fact RxPy (the library explored in the next two chapters) is single threaded by default. If an application needs the behaviour to execute asynchronously then it is necessary to explicitly indicate this.
- Another issue for some reactive programming frameworks is that it can become memory intensive to store streams of data so that observers can process that data when they are ready.

The RxPy Reactive Programming Framework

The RxPy library is part of the larger ReactiveX project and provides an implementation of ReactiveX for Python. It is built on the concepts of observables, observers, subjects and operators. In this book we use RxPy version 3.

In the next chapter we will discuss observables, observers, subjects and subscriptions using the RxPy library. The following chapter will explore various RxPy operators.

Online Resources

See the following online resources for information on reactive programming:
- The Reactive Manifesto
- The ReactiveX project
- The Gang of Four patterns book (see the Reference below)
Reference

For more information on the Observer-Observable design pattern see the "patterns" book by the Gang of Four: Gamma, E., Helm, R., Johnson, R., Vlissides, J.: Design Patterns: Elements of Reusable Object-Oriented Software. Addison-Wesley (1995).
RxPy Observables, Observers and Subjects

Introduction

In this chapter we will discuss observables, observers and subjects. We also consider how observers may or may not run concurrently. In the remainder of this chapter we look at RxPy version 3, which is a major update from RxPy version 1 (you will therefore need to be careful if you are looking on the web for examples, as some aspects have changed, most notably the way in which operators are chained).

Observables in RxPy

An observable is a Python class that publishes data so that it can be processed by one or more observers (potentially running in separate threads). An observable can be created to publish data from static data or from dynamic sources.

Observables can be chained together to control how and when data is published, to transform data before it is published and to restrict what data is actually published.

For example, to create an observable from a list of values we can use the rx.from_list() function. This function (also known as an RxPy operator) is used to create the new observable object:

import rx

observable = rx.from_list([2, 3, 5, 7])
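rx.from_list() is only one of several creation functions. As a brief illustration (based on the RxPy 3 API as we understand it, rather than on an example from the text), rx.of() and rx.range() can also be used to create observables:

import rx

# create an observable from individual values
observable1 = rx.of(2, 3, 5, 7)

# create an observable that publishes the integers 0 to 9
observable2 = rx.range(0, 10)

observable1.subscribe(lambda value: print('of:', value))
observable2.subscribe(lambda value: print('range:', value))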
Observers in RxPy

We can add an observer to an observable using the subscribe() method. This method can be supplied with a lambda function, a named function or an object whose class implements the observer protocol.

For example, the simplest way to create an observer is to use a lambda function:

# subscribe a lambda function
observable.subscribe(lambda value: print('Lambda received', value))

When the observable publishes data the lambda function will be invoked. Each data item published will be supplied independently to the function. The output from the above subscription for the previous observable is:

Lambda received 2
Lambda received 3
Lambda received 5
Lambda received 7

We could also have used a standard or named function as an observer:

def prime_number_reporter(value):
    print('Function received', value)

# subscribe a named function
observable.subscribe(prime_number_reporter)

Note that it is only the name of the function that is used with the subscribe() method (as this effectively passes a reference to the function into the method). If we now run this code using the previous observable we get:

Function received 2
Function received 3
Function received 5
Function received 7

In actual fact the subscribe() method takes four optional parameters. These are:
- on_next: an action to invoke for each data item generated by the observable.
- on_error: an action to invoke upon exceptional termination of the observable sequence.
- on_completed: an action to invoke upon graceful termination of the observable sequence.
- observer: the object that is to receive notifications.

You may subscribe using an observer or callbacks, not both.
Each of the above can be used as positional parameters or as keyword arguments, for example:

# use lambdas to set up all three functions
observable.subscribe(
    on_next=lambda value: print('Received on_next', value),
    on_error=lambda exp: print('Error Occurred', exp),
    on_completed=lambda: print('Received completed notification')
)

The above code defines three lambda functions that will be called depending upon whether data is supplied by the observable, an error occurs, or the data stream is terminated. The output from this is:

Received on_next 2
Received on_next 3
Received on_next 5
Received on_next 7
Received completed notification

Note that the on_error function is not run, as no error was generated in this example.

The final optional parameter to the subscribe() method is an observer object. An observer object can implement the observer protocol, which has the following methods: on_next(), on_completed() and on_error(). For example:

class PrimeNumberObserver:
    def on_next(self, value):
        print('Object received', value)

    def on_completed(self):
        print('Data Stream Completed')

    def on_error(self, error):
        print('Error Occurred', error)

Instances of this class can now be used as an observer via the subscribe() method:

# subscribe an observer object
observable.subscribe(PrimeNumberObserver())

The output from this example using the previous observable is:

Object received 2
Object received 3
Object received 5
Object received 7
Data Stream Completed

Note that the on_completed() method is also called; however, the on_error() method is not called as there were no exceptions generated.
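To see on_error() in action, a small hedged sketch is shown below. It assumes the rx.throw() creation function (part of the RxPy 3 API as we understand it, not an example from the text), which produces an observable that publishes no data and terminates immediately with the given exception:

import rx

# an observable that terminates with an error instead of publishing data
error_observable = rx.throw(Exception('something went wrong'))

error_observable.subscribe(
    on_next=lambda value: print('Received', value),
    on_error=lambda error: print('Error Occurred:', error),
    on_completed=lambda: print('Received completed notification')
)

Here only the on_error lambda should run, printing the exception; on_next and on_completed are never invoked.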
The observer class must ensure that the methods implemented adhere to the observer protocol (i.e. that the signatures of the on_next(), on_completed() and on_error() methods are correct).

Multiple Subscribers/Observers

An observable can have multiple observers subscribed to it. In this case each of the observers is sent all of the data published by the observable. Multiple observers can be registered with an observable by calling the subscribe method multiple times. For example, the following program has four subscribers, as well as on_error and on_completed functions registered:

print('Create the observable object')
# create an observable using data in a list
observable = rx.from_list([2, 3, 5, 7])

class PrimeNumberObserver:
    """ An Observer class """
    def on_next(self, value):
        print('Object received', value)

    def on_completed(self):
        print('Data Stream Completed')

    def on_error(self, error):
        print('Error Occurred', error)

def prime_number_reporter(value):
    print('Function received', value)

print('Set up Observers / Subscribers')
# subscribe a lambda function
observable.subscribe(lambda value: print('Lambda received', value))
# subscribe a named function
observable.subscribe(prime_number_reporter)
# subscribe an observer object
observable.subscribe(PrimeNumberObserver())
# use lambdas to set up all three functions
observable.subscribe(
    on_next=lambda value: print('Received on_next', value),
    on_error=lambda exp: print('Error Occurred', exp),
    on_completed=lambda: print('Received completed notification')
)
19,439 | multiple subscribers/observers the output from this program iscreate the observable object set up observers subscribers lambda received lambda received lambda received lambda received function received function received function received function received object received object received object received object received data stream completed received on_next received on_next received on_next received on_next received completed notification note how each of the subscribers is sent all of the data before the next subscriber is sent their data (this is the default single threaded rxpy behaviour subjects in rxpy subject is both an observer and an observable this allows subject to receive an item of data and then to republish that data or data derived from it for exampleimagine subject that receives stock market price data published by an external (to the organisation receiving the datasource this subject might add timestamp and source location to the data before republishing it to other internal observers howeverthere is subtle difference that should be noted between subject and plain observable subscription to an observable will cause an independent execution of the observable when data is published notice how in the previous section all the messages were sent to specific observer before the next observer was sent any data at all howevera subject shares the publication action with all of the subscribers and they will therefore all receive the same data item in chain before the next data item in the class hierarchy the subject class is direct subclass of the observer class |
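Before looking at a fuller example, the dual role of a subject can be illustrated directly with a plain Subject. This is a minimal sketch, assuming the same rx.subjects module imported in the example that follows; the values pushed into the subject are purely illustrative:

from rx.subjects import Subject

# A Subject is an observer (we can call on_next() on it) and an
# observable (other observers can subscribe to it).
subject = Subject()
subject.subscribe(lambda value: print('observer received', value))

subject.on_next(2)      # data pushed into the subject is republished
subject.on_next(3)      # to every observer subscribed to it
subject.on_completed()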
19,440 | rxpy observablesobservers and subjects the following example creates subject that enriches the data it receives by adding timestamp to each data item it then republishes the data item to any observers that have subscribed to it import rx from rx subjects import subject from datetime import datetime source rx from_list([ ]class timestampsubject(subject)def on_next(selfvalue)print('subject received'valuesuper(on_next((valuedatetime now())def on_completed(self)print('data stream completed'super(on_completed(def on_error(selferror)print('in subject error occurred'errorsuper(on_error(errordef prime_number_reporter(value)print('function received'valueprint('set up'create the subject subject timestampsubject(set up multiple subscribers for the subject subject subscribe(prime_number_reportersubject subscribe(lambda valueprint('lambda received'value)subject subscribeon_next lambda valueprint('received on_next'value)on_error lambda expprint('error occurred'exp)on_completed lambdaprint('received completed notification'subscribe the subject to the observable source source subscribe(subjectprint('done'note that in the above program the observers are added to the subject before the subject is added to the source observable this ensures that the observers are subscribed before the subject starts to receive data published by the |
19,441 | subjects in rxpy observable if the subject was subscribed to the observable before the observers were subscribed to the subjectthen all the data could have been published before the observers were registered with the subject the output from this program isset up subject received function received ( datetime datetime( )lambda received ( datetime datetime( )received on_next ( datetime datetime( )subject received function received ( datetime datetime( )lambda received ( datetime datetime( )received on_next ( datetime datetime( )subject received function received ( datetime datetime( )lambda received ( datetime datetime( )received on_next ( datetime datetime( )subject received function received ( datetime datetime( )lambda received ( datetime datetime( )received on_next ( datetime datetime( )data stream completed received completed notification done as can be seen from this output the numbers and are received by all of the observers once the subject has added the timestamp observer concurrency by default rxpy uses single threaded modelthat is observables and observers execute in the same thread of execution howeverthis is only the default as it is the simplest approach it is possible to indicate that when observer subscribes to an observable that it should run in separate thread using the scheduler keyword parameter on the |
19,442 | rxpy observablesobservers and subjects subscribe(method this keyword is given an appropriate scheduler such as the rx concurrency newthreadscheduler this scheduler will ensure that the observer runs in separate thread to see the difference look at the following two programs the main difference between the programs is the use of specific schedulersimport rx observable rx from_list([ ]observable subscribe(lambda vprint('lambda received' )observable subscribe(lambda vprint('lambda received' )observable subscribe(lambda vprint('lambda received' )the output from this first version is given belowlambda received lambda received lambda received lambda received lambda received lambda received lambda received lambda received lambda received the subscribe(method takes an optional keyword parameter called scheduler that allows scheduler object to be provided now if we specify few different schedulers we will see that the effect is to run the observers concurrently with the resulting output being interwovenimport rx from rx concurrency import newthreadschedulerthreadpoolschedulerimmediatescheduler observable rx from_list([ ]observable subscribe(lambda vprint('lambda received' )scheduler=threadpoolscheduler( )observable subscribe(lambda vprint('lambda received' )scheduler=immediatescheduler()observable subscribe(lambda vprint('lambda received' )scheduler=newthreadscheduler()as the observable runs in separate thread need ensure that the main thread does not terminate input('press enter to finish' |
19,443 | Note that we have to ensure that the main thread running the program does not terminate (as the observables are now running in their own threads); this is done by waiting for user input. The output from this version is:
lambda received
lambda received
lambda received
lambda received
lambda received
lambda received
Press enter to finish
lambda received
lambda received
lambda received
By default the scheduler keyword on the subscribe() method defaults to None, indicating that the current thread will be used for the subscription to the observable.
Available Schedulers
To support different scheduling strategies the RxPy library provides two modules that supply different schedulers: rx.concurrency and rx.concurrency.mainloopscheduler. These modules contain a variety of schedulers, including those listed below. The following schedulers are available in the rx.concurrency module:
- ImmediateScheduler: schedules an action for immediate execution.
- CurrentThreadScheduler: schedules activity for the current thread.
- TimeoutScheduler: a scheduler that works via a timed callback.
- NewThreadScheduler: creates a scheduler for each unit of work on a separate thread.
- ThreadPoolScheduler: a scheduler that utilises a thread pool to execute work; this scheduler can act as a way of throttling the amount of work carried out concurrently.
The rx.concurrency.mainloopscheduler module also defines the following schedulers:
- IOLoopScheduler: a scheduler that schedules work via the Tornado I/O main event loop.
- PyGameScheduler: a scheduler that schedules work for PyGame.
- WxScheduler: a scheduler for the wxPython event loop.
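As a short illustration of one of these, the following sketch (which assumes the same rx.concurrency module used in the earlier examples) creates a ThreadPoolScheduler whose pool size is based on the number of CPUs available and uses it for a subscription. The pool-sizing heuristic and the data values are illustrative assumptions rather than requirements:

import multiprocessing
import rx
from rx.concurrency import ThreadPoolScheduler

# Size the thread pool according to the number of CPUs on this machine.
optimal_thread_count = multiprocessing.cpu_count()
pool_scheduler = ThreadPoolScheduler(optimal_thread_count)

observable = rx.from_list([2, 3, 5, 7])
observable.subscribe(
    lambda value: print('received', value),
    scheduler=pool_scheduler)

# As before, keep the main thread alive while the pool threads run.
input('Press enter to finish')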
19,444 | rxpy observablesobservers and subjects online resources see the following online resources for information on rxpyoperators exercises given the following set of tuples representing stock/equity pricesstocks (('appl' )('ibm' )('msft' )('appl' )write program that will create an observable based on the stocks data next subscribe three different observers to the observable the first should print out the stock pricethe second should print out the name of the stock and the third should print out the entire tuple |
19,445 | rxpy operators introduction in this we will look at the types of operators provided by rxpy that can be applied to the data emitted by an observable reactive programming operators behind the interaction between an observable and an observer is data stream that is the observable supplies data stream to an observer that consumesprocesses that stream it is possible to apply an operator to this data stream that can be used to to filtertransform and generally refine how and when the data is supplied to the observer the operators are mostly defined in the rx operators modulefor example rx operators average(however it is common to use an alias for this such that the operators module is called opsuch as from rx import operators as op this allows for short hand form to be used when referencing an operatorsuch as op average(many of the rxpy operators execute function which is applied to each of the data items produced by an observable others can be used to create an initial observable (indeed you have already seen these operators in the form of the from_list(operatoranother set of operators can be used to generate result based on data produced by the observable (such as the sum(operator(cspringer nature switzerland ag huntadvanced guide to python programmingundergraduate topics in computer science |
19,446 | rxpy operators in fact rxpy provides wide variety of operators and these operators can be categorised as followscreationaltransformationalcombinatorialfilterserror handlersconditional and boolean operatorsmathematicalconnectable examples of some of these categories are presented in the rest of this section piping operators to apply an operator other than creational operator to an observable it is necessary to create pipe pipe is essentially series of one or more operations that can be applied to the data stream generated by the observable the result of applying the pipe is that new data stream is generated that represents the results produced following the application of each operator in turn this is illustrated belowto create pipe the observable pipe(method is used this method takes comma delimited list of one or more operators and returns data stream observers can then subscribe to the pipe' data stream this can be seen in the examples given in the rest of this for transformationsfiltersmathematical operators etc |
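As a small concrete illustration of this mechanism (the individual operators themselves are described in the following sections), the sketch below pipes a single map() operator onto an observable and subscribes to the resulting data stream; the values used are illustrative:

import rx
from rx import operators as op

# pipe() takes one or more operators and returns a new data stream
# representing the result of applying each operator in turn.
source = rx.from_list([1, 2, 3]).pipe(
    op.map(lambda value: value * 10)
)

source.subscribe(lambda value: print('received', value))
# received 10, received 20, received 30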
19,447 | creational operators creational operators you have already seen an example of creational operator in the examples presented earlier in this this is because the rx from_list(operator is an example of creational operator it is used to create new observable based on data held in list like structure more generic version of from_list(is the from_(operator this operator takes an iterable and generates an observable based on the data provided by the iterable any object that implements the iterable protocol can be used including user defined types there is also an operator from_iterable(all three operators do the same thing and you can choose which to use based on which provides the most semantic meaning in your context all three of the following statements have the same effectsource rx from_([ ]source rx from_iterable([ ]source rx from_list([ ]this is illustrated pictorially belowanother creational operator is the rx range(operator this operator generates an observable for range of integer numbers the range can be specified with our without starting value and with or within an increment however the maximum value in the range must always be providedfor exampleobs rx range( obs rx range( obs rx range( transformational operators there are several transformational operators defined in the rx operators module including rx operators map(and rx operators flat_map(the rx operators map(operator applies function to each data item generated by an observable |
19,448 | rxpy operators the rx operators flat_map(operator also applies function to each data item but then applies flatten operation to the result for exampleif the result is list of lists then flat_map will flatten this into single list in this section we will focus on the rx operators map(operator the rx operators map(operator allows function to be applied to all data items generated by an observable the result of this function is then returned as the result of the map(operators observable the function is typically used to perform some form of transformation to the data supplied to it this could be adding one to all integer valuesconverting the format of the data from xml to jsonenriching the data with additional information such as the time the data was acquired and who the data was supplied by etc in the example given below we are transforming the set of integer values supplied by the original observable into strings in the diagram these strings include quotes around them to highlight they are in fact stringthis is typical of the use of transformation operatorthat is to change the data from one format to another or to add information to the data the code used to implement this scenario is given below note the use of the pipe(method to apply the operator to the data stream generated by the observableapply transformation to data source to convert integers into strings import rx from rx import operators as op set up source with map function source rx from_list([ ]pipeop map(lambda value"'str(value"'"subscribe lambda function source subscribe(lambda valueprint('lambda received'valueis string 'isinstance(valuestr)) |
19,449 | transformational operators the output from this program islambda received ' lambda received ' lambda received ' lambda received ' is string is string is string is string true true true true combinatorial operators combinatorial operators combine together multiple data items in some way one example of combinatorial operator is the rx merge(operator this operator merges the data produced by two observables into single observable data stream for examplein the above diagram two observables are represented by the sequence and the sequence these observables are supplied to the merge operator that generates single observable that will supply data generated from both of the original observables this is an example of an operator that does not take function but instead takes two observables the code representing the above scenario is given belowan example illustrating how to merge two data sources import rx set up two sources source rx from_list([ ]source rx from_list([ ]merge two sources into one rx merge(source source )subscribe(lambda vprint(vend=',')notice that in this case we have subscribed directly to the observable returned by the merge(operator and have not stored this in an intermediate variable (this was design decision and either approach is acceptable |
19,450 | rxpy operators the output from this program is presented below , , , , , , notice from the output the way in which the data held in the original observables is intertwined in the output of the observable generated by the merge(operator filtering operators there are several operators in this category including rx operators filter ()rx operators first()rx operators last(and rx operators distinct(the filter(operator only allows those data items to pass through that pass some test expression defined by the function passed into the filter this function must return true or false any data item that causes the function to return true is allowed to pass through the filter for examplelet us assume that the function passed into filter(is designed to only allow even numbers through if the data stream contains the numbers and then the filter(will only emit the numbers and this is illustrated belowthe following code implements the above scenariofilter source for even numbers import rx from rx import operators as op set up source with filter source rx from_list([ ]pipeop filter(lambda valuevalue = subscribe lambda function source subscribe(lambda valueprint('lambda received'value)in the above code the rx operators filter(operator takes lambda function that will verify if the current value is even or not (note this could have been named function or method on an object etc it is applied to the data stream generated by the observable using the pipe(method the output generated by this example is |
19,451 | filtering operators lambda received lambda received lambda received the first(and last(operators emit only the first and last data item published by the observable the distinct(operator suppresses duplicate items being published by the observable for examplein the following list used as the data for the observablethe numbers and are duplicateduse distinct to suppress duplicates source rx from_list([ ]pipeop distinct(subscribe lambda function source subscribe(lambda valueprint('received'value)howeverwhen the output is generated by the program all duplicates have been suppressedreceived received received received mathematical operators mathematical and aggregate operators perform calculations on the data stream provided by an observable for examplethe rx operators average(operator can be used to calculate the average of set of numbers published by an observable similarly rx operators max(can select the maximum valuerx operators min(the minimum value and rx operators sum(will total all the numbers published etc an example using the rx operators sum(operator is given blowexample of summing all the values in data stream import rx from rx import operators as op set up source and apply sum rx from_list([ ]pipeop sum(subscribe(lambda vprint( ) |
19,452 | rxpy operators the output from the rx operators sum(operator is the total of the data items published by the observable (in this case the total of and the observer function that is subscribed to the rx operators sum(operators observable will print out this valuehoweverin some cases it may be useful to be notified of the intermediate running total as well as the final value so that other operators down the chain can react to these subtotals this can be achieved using the rx operators scan(operator the rx operators scan(operator is actually transformational operator but can be used in this case to provide mathematical operation the scan(operator applies function to each data item published by an observable and generates its own data item for each value received each generated value is passed to the next invocation of the scan(function as well as being published to the scan(operators observable data stream the running total can thus be generated from the previous sub total and the new value obtained this is shown belowimport rx from rx import operators as op rolling or incremental sum rx from_([ ]pipeop scan(lambda subtotalisubtotal+isubscribe(lambda vprint( )the output from this example is this means that each subtotal is published as well as the final total chaining operators an interesting aspect of the rxpy approach to data stream processing is that it is possible to apply multiple operators to the data stream produced by an observable the operators discussed earlier actually return another observable this new observable can supply its own data stream based on the original data stream and the result of applying the operator this allows another operator to be applied in sequence to the data produced by the new observable this allows the operators to be chained together to provide sophisticated processing of the data published by the original observable |
19,453 | For example, we might first start off by filtering the output from an observable so that only certain data items are published; we might then apply a transformation, in the form of a map() operator, to that data, as shown in the code below. Note the order in which we have applied the operators: we first filter out data that is not of interest and then apply the transformation. This is more efficient than applying the operators the other way around as, in this example, we do not need to transform the odd values. It is therefore common to try and push the filter operators as high up the chain as possible. The code used to generate the chained set of operators is given below. In this case we have used lambda functions to define the filter() function and the map() function. The operators are applied to the observable obtained from the list supplied. The data stream generated by the observable is processed by each of the operators defined in the pipe; as there are now two operators, the pipe contains both operators and acts as a pipe down which the data flows. The list used as the initial source of the observable's data contains a sequence of even and odd numbers; the filter() function selects only the even numbers and the map() function transforms the integer values into strings. We then subscribe an observer function to the observable produced by the transformational map() operator.
# Example of chaining operators together
import rx
from rx import operators as op

# Set up a source with a filter and a map
source = rx.from_list([ ])
pipe = source.pipe(
    op.filter(lambda value: value % 2 == 0),
    op.map(lambda value: "'" + str(value) + "'")
)

# Subscribe a lambda function
pipe.subscribe(lambda value: print('received', value))
19,454 | rxpy operators the output from this application is given belowreceived ' received ' received ' this makes it clear that only the three even numbers ( and are allowed through to the map(function online resources see the following online resources for information on rxpyoperators exercises given the following set of tuples representing stock/equity pricesstocks (('appl' )('ibm' )('msft' )('appl' )provide solutions to the followingselect all the 'applstocks select all stocks with price over find the average price of all 'applstocks now use the second set of tuples and merge them with the first set of stock pricesstocks (('goog' )('appl' )('appl' )('msft' )('goog' )('ibm' )convert each tuple into list and calculate how much shares in that stock would beprint this out as the resultfind the highest value stock find the lowest value stock only publish unique data times ( suppress duplicates |
19,455 | network programming |
19,456 | introduction to sockets and web services introduction in the following two we will explore socket based and web service approaches to inter process communications these processes may be running on the same computer or different computers on the same local area network or may be geographically far apart in all cases information is sent by one program running in one process to another program running in separate process via internet sockets this introduces the core concepts involved in network programming sockets socketsor rather internet protocol (ipsockets provide programming interface to the network protocol stack that is managed by the underlying operating system using such an api means that the programmer is abstracted away from the low level details of how data is exchanged between process on (potentiallydifferent computers and can instead focus on the higher level aspects of their solution there are number of different types of ip socket availablehowever the focus in this book is on stream sockets stream socket uses the transmission control protocol (tcpto send messages such socket is often referred to as tcp/ip socket tcp provides for ordered and reliable transmission of data across the connection between two devices (or hoststhis can be important as tcp guarantees that for every message sentthat every message will not only arrive at the receiving host but that the messages will arrive in the correct order common alternative to the tcp is the user datagram protocol (or udpudp does not provide any delivery guarantees (that is messages can be lost or may arrive out of orderhoweverudp is simpler protocol and can be particularly useful for (cspringer nature switzerland ag huntadvanced guide to python programmingundergraduate topics in computer science |
19,457 | introduction to sockets and web services broadcast systemswhere multiple clients may need to receive the data published by server host (particularly if data loss is not an issue web services web service is service offered by host computer that can be invoked by remote client using the hypertext transfer protocol (httphttp can be run over any reliable stream transport protocolalthough it is typically used over tcp/ip it was originally designed to allow data to be transferred between http server and web browser so that the data could be presented in human readable form to user howeverwhen used with web service it is used to support program to program communication between client and server using machine-readable data formats currently this format is most typically json (java script object notationalthough in the past xml (extensible markup languagewas often used addressing services every device (hostconnected to the internet has unique identity (we are ignoring private networks herethis unique identity is represented as an ip address using an ip address we can connect socket to specific host anywhere on the internet it is therefore possible to connect to whole range of device types in this way from printers to cash tills to fridges as well as serversmainframes and pcs etc ip addresses have common format such as an ip version address is always set of four numbers separated by full stops each number can be in the range - so the full range of ip addresses is from to an ip address can be divided up into two partsthe part indicating the network on which the host is connected and the host' idfor examplethusthe network id elements of the ip address identifies the specific network on which the host is currently located the host id is the part of the ip address that specifies specificities device on the network (such as your computer |
19,458 | addressing services on any given network there may be multiple hostseach with their own host id but with shared network id for exampleon private home network there may be jasmine' laptop adam' pc home printer smart tv in many ways the network id and host id elements of an ip address are like the postal address for house on street the street may have namefor example coleridge avenue and there may be multiple houses on the street each house has unique numberthus coleridge avenue is uniquely differentiated from coleridge avenue by the house number at this point you may be wondering where the urls you see in your web browser come into play (such as www bbc co ukthese are textual names that actually map to an ip address the mapping is performed by something called domain name system (or dnsserver dns server acts as lookup service to provide the actual ip address for particular textual url name the presence of an english textual version of host address is because humans are better at remembering ( hopefullymeaningful name rather than what might appear to be random sequence of numbers there are several web sites that can be used to see these mappings (and one is given at the end of this some examples of how the english textual name maps to an ip address are given belowwww aber ac uk maps to www uwe ac uk maps to www bbc net uk maps to www gov uk maps to note that these mappings were correct at the time of writingthey can change as new entries can be provided to the dns servers causing particular textual name to map to different physical host localhost there is special ip address which is usually available on host computer and is very useful for developers and testers this is the ip address it is also known as localhost which is often easier to remember |
19,459 | introduction to sockets and web services localhost (and is used to refer to the computer you are currently on when program is runthat is it is your local host computer (hence the name localhostfor exampleif you start up socket server on your local computer and want client socket programrunning on the same computerto connect to the server programyou can tell it to do so by getting it to connect to localhost this is particularly useful when either you don' know the ip address of your local computer or because the code may be run on multiple different computers each of which will have their own ip address this is particularly common if you are writing test code that will be used by developers when running their own tests on different developer (hostmachines we will be using localhost in the next two as way of specifying where to look for server program port numbers each internet device/host can typically support multiple processes it is therefore necessary to ensure that each process has its own channel of communications to do this each host has available to it multiple ports that program can connect too for example port is often reserved for http web serverswhile port is reserved for smtp servers this means that if client wants to connect to http server on particular computer then it must specify port not port on that host port number is written after the ip address of the host and separated from the address by colonfor examplewww aber ac uk: indicates port on the host machine which will typically be running http serverin this case for aberystwyth university localhost: this indicates that you wish to connect to port which is typically reserved for an imap (internet message access protocolserver on your local machine www uwe ac uk: this indicates port on host running at the university of the west of englandbristol port is usually reserved for smtp (simple mail transfer protocolservers port numbers in the ip system are bit numbers in the range - generallyport numbers below are reserved for pre-defined services (which means that you should avoid using them unless you wish to communicate with one of those services such as telnetsmtp mailftp etc therefore it is typically to choose port number above when setting up your won services |
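The name-to-address mappings discussed above can also be explored programmatically using the standard library's socket module; the following small sketch resolves a host name to an IP address (the actual addresses printed depend on your machine and network configuration):

import socket

# Resolve textual host names to IP addresses.
print(socket.gethostbyname('localhost'))           # typically 127.0.0.1
print(socket.gethostbyname(socket.gethostname()))  # this machine's own address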
19,460 | ipv versus ipv ipv versus ipv what we have described in this in terms of ip addresses is in fact based on the internet protocol version (aka ipv this version of the internet protocol was developed during the and published by the ietf (internet engineering task forcein september (replacing an earlier definition published in january this version of the standard uses binary bits for each element of the host address (hence the range of to for each of there parts of the addressthis provides total of billion possible unique addresses this seemed huge amount in and certainly enough for what was imagined at the time for the internet since the internet has become the backbone to not only the world wide web itselfbut also to the concept of the internet of things (in which every possible device might be connected to the internet from your fridgeto your central heating system to your toasterthis potential explosion in internet addressable deviceshosts lead in the mid as to concerns about the potential lack of internet addresses using ipv the ietf therefore designed new version of the internet protocolinternet protocol version (or ipv this was ratified as an internet standard in july ipv uses bit address for each element in hosts address it also uses eight number groups (rather than which are separated by colon each number group has four hexadecimal digits the following illustrates what an ipv address looks like : db :ac :fe :ef : ed:dd : cle uptake of the ipv protocol has been slower than was originally expectedthis is in part because the ipv and ipv have not been designed to be interoperable but also because the utilisation of the ipv addresses has not been as fast as many originally feared (partly due to the use of private networkshoweverover time this is likely to change as more organisations move over to using the ipv sockets and web services in python the next two discuss how sockets and web services can be implemented in python the first discusses both general sockets and http server sockets the second looks at how the flask library can be used to create web services that run over http using tcp/ip sockets |
19,461 | introduction to sockets and web services online resources see the following online resources for information ip addresses dns |
19,462 | sockets in python introduction socket is an end point in communication link between separate processes in python sockets are objects which provide way of exchanging information between two processes in straight forward and platform independent manner in this we will introduce the basic idea of socket communications and then presents simple socket server and client application socket to socket communication when two operating system level processes wish to communicatethey can do so via sockets each process has socket which is connected to the others socket one process can then write information out to the socketwhile the second process can read information in from the socket associated with each socket are two streamsone for input and one for output thusto pass information from one process to anotheryou write that information out to the output stream of one socket object and read it from the input stream of another socket object (assuming the two sockets are connectedseveral different types of sockets are availablehowever in this we will focus on tcp/ip sockets such socket is connection-oriented socket that will provide guarantee of delivery of data (or notification of the failure to deliver the datatcp/ipor the transmission control protocol/internet protocolis suite of communication protocols used to interconnect network devices on the internet or in private intranet tcp/ip actually specifies how data is exchanged between programs over the internet by providing end-to-end communications that identify how the data should be broken down into packetsaddressedtransmittedrouted and received at the destination (cspringer nature switzerland ag huntadvanced guide to python programmingundergraduate topics in computer science |
19,463 | sockets in python setting up connection to set up the connectionone process must be running program that is waiting for connection while the other must try to connect up to the first program the first is referred to as server socket while the second just as socket for the second process to connect to the first (the server socketit must know what machine the first is running on and which port it is connected to for examplein the above diagram the server socket connects to port in turn the client socket connects to the machine on which the server is executing and to port number on that machine nothing happens until the server socket accepts the connection at that point the sockets are connectedand the socket streams are bound to each other this means that the server' output stream is connected to the client socket input stream and vice versa an example client server application the system structure the above diagram illustrates the basic structure of the system we are trying to build there will be server object running on one machine and client object running on another the client will connect up to the server using sockets in order to obtain information the actual application being implemented in this exampleis an address book look up application the addresses of employees of company are held in dictionary this dictionary is set up in the server program but could equally be held in database etc when client connects up to the server it can obtain an employeesoffice address |
19,464 | an example client server application implementing the server application we shall describe the server application first this is the python application program that will service requests from client applications to do this it must provide server socket for clients to connect to this is done by first binding server socket to port on the server machine the server program must then listen for incoming connections the listing presents the source code for the server program import socket def main()setup names and offices addresses {'john'' ''denise'' ''phoebe'' ''adam'' 'print('starting server'print('create the socket'sock socket socket(socket af_inetsocket sock_streamprint('bind the socket to the port'server_address (socket gethostname() print('starting up on'server_addresssock bind(server_addressspecifies the number of connections allowed print('listen for incoming connections'sock listen( while trueprint('waiting for connection'connectionclient_address sock accept( |
19,465 | sockets in python tryprint('connection from'client_addresswhile truedata connection recv( decode(print('received'dataif datakey str(dataupper(response addresses[keyprint('sending data back to the client'responseconnection sendallresponse encode()elseprint('no more data from'client_addressbreak finallyconnection close(if __name__ ='__main__'main(the server in the above listing sets up the addresses to contain dictionary of the names and addresses it then waits for client to connect to it this is done by creating socket and binding it to specific port (in this case port usingprint('create the socket'sock socket socket(socket af_inetsocket sock_streamprint('bind the socket to the port'server_address (socket gethostname() the construction of the socket object is discussed in more detail in the next section next the server listens for connection from client note that the sock listen(method takes the value indicating that it will handle one connection at time an infinite loop is then set up to run the server when connection is made from clientboth the connection and the client address are made available while there is data available from the clientit is read using the recv function note that the data received from the client is assumed to be string this is then used as key to look the address up in the address dictionary |
19,466 | an example client server application once the address is obtained it can be sent back to the client in python it is necessary to decode(and encoded(the string format to the raw data transmitted via the socket streams note you should always close socket when you have finished with it socket types and domains when we created the socket class abovewe passed in two arguments to the socket constructorsocket(socket af_inetsocket sock_streamto understand the two values passed into the socket(constructor it is necessary to understand that sockets are characterised according to two propertiestheir domain and their type the domain of socket essentially defines the communications protocols that are used to transfer the data from one process to another it also incorporates how sockets are named (so that they can be referred to when establishing the communicationtwo standard domains are available on unix systemsthese are af_unix which represents intra-system communicationswhere data is moved from process to process through kernel memory buffers af_inet represents communication using the tcp/ip protocol suitein which processes may be on the same machine or on different machines socket' type indicates how the data is transferred through the socket there are essentially two options heredatagram which sockets support message-based model where no connection is involvedand communication is not guaranteed to be reliable stream sockets that support virtual circuit modelwhere data is exchanged as byte stream and the connection is reliable depending on the domainfurther socket types may be availablesuch as those that support message passing on reliable connection implementing the client application the client application is essentially very simple program that creates link to the server application to do this it creates socket object that connects to the servershost machineand in our case this socket is connected to port once connection has been made the client can then send the encoded message string to the server the server will then send back response which the client must decode it then closes the connection |
19,467 | sockets in python the implementation of the client is given belowimport socket def main()print('starting client'print('create tcp/ip socket'sock socket socket(socket af_inetsocket sock_streamprint('connect the socket to the server port'server_address (socket gethostname() print('connecting to'server_addresssock connect(server_addressprint('connected to server'tryprint('send data'message 'johnprint('sending'messagesock send(message encode()data sock recv( decode(print('received from server'datafinallyprint('closing socket'sock close(if __name__ ='__main__'main(the output from the two programs needs to be considered together |
19,468 | implementing the client application as you can see from this diagramthe server waits for connection from the client when the client connects to the serverthe server waits to receive data from the client at this point the client must wait for data to be sent to it from the server the server then sets up the response data and sends it back to the client the client receives this and prints it out and closes the connection in the meantimethe server has been waiting to see if there is any more data from the clientas the client closes the connection the server knows that the client has finished and returns to waiting for the next connection the socketserver module in the above examplethe server code is more complex than the clientand this is for single threaded serverlife can become much more complicated if the server is expected to be multi-threaded server (that is server that can handle multiple requests from different clients at the same timehoweverthe serversocket module provides more convenientobject-oriented approach to creating server much of the boiler plate code needed in such applications is defined in classeswith the developer only having to provide their own classes or override methods to define the specific functionality required there are five different server classes defined in the socketserver module baseserver is the root of the server class hierarchyit is not really intended to be instantiated and used directly instead it is extended by tcpserver and other classes tcpserver uses tcp/ip sockets to communicate and is probably the most commonly used type of socket server udpserver provides access to datagram sockets unixstreamserver and unixdatagramserver use unix-domain sockets and are only available on unix platforms responsibility for processing request is split between server class and request handler class the server deals with the communication issues (listening on socket and portaccepting connectionsetc and the request handler deals with the request issues (interpreting incoming dataprocessing itsending data back to the clientthis division of responsibility means that in many cases you can simply use one of the existing server classes without any modifications and provide custom request handler class for it to work with the following example defines request handler that is plugged into the tcpserver when it is constructed the request handler defines method handle(that will be expected to handle the request processing |
19,469 | sockets in python import socketserver class mytcphandler(socketserver baserequesthandler)""the requesthandler class for the server ""def __init__(selfrequestclient_addressserver)print('setup names and offices'self addresses {'john'' ''denise'' ''phoebe'' ''adam'' 'super(__init__(requestclient_addressserverdef handle(self)print('in handle'self request is the tcp socket connected to the client data self request recv( decode(print('data received:'datakey str(dataupper(response self addresses[keyprint('response:'responsesend the result back to the client self request sendall(response encode()def main()print('starting server'server_address ('localhost' print('creating server'server socketserver tcpserver(server_addressmytcphandlerprint('activating server'server serve_forever(if __name__ ='__main__'main(note that the previous client application does not need to change at allthe server changes are hidden from the client howeverthis is still single threaded server we can very simply make it into multi-threaded server (one that can deal with multiple requests concurrentlyby mixing the socketserver threadingmixin into the tcpserver this can be done by defining new class that is nothing more than class that extends both |
19,470 | ThreadingMixIn and TCPServer, and creating an instance of this new class instead of the TCPServer directly. For example:
class ThreadedEchoServer(socketserver.ThreadingMixIn,
                         socketserver.TCPServer):
    pass

def main():
    print('Starting')
    address = ('localhost', )
    server = ThreadedEchoServer(address, MyTCPHandler)
    print('Activating server')
    server.serve_forever()

In fact you do not even need to create your own class (such as the ThreadedEchoServer) as the socketserver.ThreadingTCPServer has been provided as a default mix-in of the TCPServer and the ThreadingMixIn classes. We could therefore just write:
def main():
    print('Starting')
    address = ('localhost', )
    server = socketserver.ThreadingTCPServer(address, MyTCPHandler)
    print('Activating server')
    server.serve_forever()

HTTP Server
In addition to the TCPServer you also have available the http.server.HTTPServer. This can be used in a similar manner to the TCPServer, but is used to create servers that respond to the HTTP protocol used by web browsers. In other words it can be used to create a very simple web server (although it should be noted that it is really only suitable for creating test web servers as it only implements very basic security checks). It is probably worth a short aside to illustrate how a web server and a web browser interact. The following diagram illustrates the basic interactions.
19,471 | sockets in python in the above diagram the user is using browser (such as chromeie or safarito access web server the browser is running on their local machine (which could be pca maca linux boxan ipada smart phone etc to access the web server they enter url (universal resource locatoraddress into their browser in the example this is the url www foo com it also indicates that they want to connect up to port (rather than the default port used for http connectionsthe remote machine (which is the one indicated by the address www foo comreceives this request and determines what to do with it if there is no program monitoring port it will reject the request in our case we have python program (which is actually the web server programlistening to that port and it is passed the request it will then handle this request and generate response message which will be sent back to the browser on the users local machine the response will indicate which version of the http protocol it supportswhether everything went ok or not (this is the code in the above diagram you may have seen the code indicating that web page was not found etc the browser on the local machine then renders the data as web page or handles the data as appropriate etc to create simple python web server the http server httpserver can be used directly or can be subclassed along with the socketserver threadingmixin to create multi-threaded web serverfor exampleclass threadinghttpserver(threadingmixinhttpserver)"""simple multi-threaded http server ""pass since python the http server module now provides exactly this class as built in facility and it is thus no longer necessary to define it yourself (see http server threadinghttpserverto handle http requests you must implement one of the http request methods such as do_get()or do_post(each of these maps to type of http requestfor exampledo_get(maps to http get request that is generated if you type web address into the url bar of web browser or do_post(maps to http post request that is used for examplewhen form on web page is used to submit data to web server the do_get(selfor do_post(selfmethod must then handle any input supplied with the request and generate any appropriate responses back to the browser this means that it must follow the http protocol |
19,472 | http server the following short program creates simple web server that will generate welcome message and the current time as response to get request it does this by using the datetime module to create time stamp of the date and time using the today(function this is converted into byte array using the utf- character encoding (utf- is the most widely used way to represent text within web pageswe need byte array as that is what will be executed by the write(method later on having done this there are various items of meta data that need to be set up so that the browser knows what data it is about to receive this meta data is known as header data and can including the type of content being sent and the amount of data (contentbeing transmitted in our very simple case we need to tell it that we are sending it plain text (rather than the html used to describe typical web pagevia the 'content-typeheader information we also need to tell it how much data we are sending using the content length we can then indicate that we have finished defining the header information and are now sending the actual data the data itself is sent via the wfile attribute inherited from the basehttprequesthandler there are infact two related attributes rfile and wfilerfile this is an input stream that allows you to read input data (which is not being used in this examplewfile holds the output stream that can be used to write (senddata to the browser this object provides method write(that takes byte-like object that is written out to (eventuallythe browser main(method is used to set up the http server which follows the pattern used for the tcpserverhowever the client of this server will be web browser from http server import basehttprequesthandlerthreadinghttpserver from datetime import datetime class myhttprequesthandler(basehttprequesthandler)"""very simple request handler only supports get ""def do_get(self)print("do_get(starting to process request"welcome_msg 'hello from server at str(datetime today()byte_msg bytes(welcome_msg'utf- 'self send_response( self send_header("content-type"'text/plaincharsetutf- ' |
19,473 | sockets in python self send_header('content-length'str(len(byte_msg))self end_headers(print('do_get(replying with message'self wfile write(byte_msgdef main()print('setting up server'server_address ('localhost' httpd threadinghttpserver(server_addressmyhttprequesthandlerprint('activating http server'httpd serve_forever(if __name__ ='__main__'main(once the server is up and runningit is possible to connect to the server using browser and by entering an appropriate web address into the browsersurl field this means that in your browser (assuming it is running on the same machine as the above programyou only need to type into the url bar local machine at port when you do this you should see the welcome message with the current date and time |
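The running server can also be exercised programmatically rather than through a browser. The sketch below uses the standard library's urllib.request module; the port number 8080 is an assumption here, so adjust it to whatever port your server was actually bound to:

from urllib.request import urlopen

# Fetch the welcome message from the simple HTTP server running locally.
with urlopen('http://localhost:8080') as response:
    print(response.status)
    print(response.read().decode('utf-8'))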
19,474 | online resources online resources see the following online resources for information on the topics in this in python page page on socketserver documentation page on the http server module development for creating web applications exercises the aim of this exercise is to explore working with tcp/ip sockets you should create tcp server that will receive string from client check should then be made to see what information the string indicates is requiredsupported inputs are'datewhich should result in the current date being returned 'timewhich should result in the current time being returned 'date and timewhich should result in the current date and time being returned anything else should result in the input string being returned to the client in upper case with the message 'unknown option'preceding the string the result is then sent back to the client you should then create client program to call the server the client program can request input from the user in the form of string and submit that string to the server the result returned by the server should be displayed in the client before prompting the user for the next input if the user enters - as input then the program should terminate |
19,475 | sockets in python an example of the type of output the client might generate is given below to illustrate the general aim of the exercisestarting client please provide an input (datetimedataandtime or - to exit)date connected to server sending data received from serverclosing socket please provide an input (datetimedataandtime or - to exit)time connected to server sending data received from server : : closing socket please provide an input (datetimedataandtime or - to exit)dateandtime connected to server sending data received from server : : closing socket please provide an input (datetimedataandtime or - to exit)- |
19,476 | web services in python introduction this looks at restful web services as implemented using the flask framework restful services rest stands for representational state transfer and was termed coined by roy fielding in his ph to describe the lightweightresource-oriented architectural style that underpins the web fieldingone of the principle authors of httpwas looking for way of generalising the operation of http and the web the generalised the supply of web pages as form of data supplied on demand to client where the client holds the current state of an exchange based on this state information the client requests the next item of relevant data sending all information necessary to identify the information to be supplied with the request thus the requests are independent and not part of an on-going stateful conversation (hence state transferit should be noted that although fielding was aiming to create way of describing the pattern of behaviour within the webhe also had an eye on producing lighter weight web based services (than those using either proprietary enterprise integration frameworks or soap based servicesthese lighter weight http based web services have become very popular and are now widely used in many areas systems which follow these principles are termed restful services key aspect of restful service is that all interactions between client (whether some javascript running in browser or standalone applicationare done using simple http based operations http supports four operations these are http gethttp posthttp put and http delete these can be used as (cspringer nature switzerland ag huntadvanced guide to python programmingundergraduate topics in computer science |
19,477 | web services in python verbs to indicate the type of action being requested typically these are used as followsretrieve information (http get)create information (http post)update information (http put)delete information (http deleteit should be noted that rest is not standard in the way that html is standard rather it is design pattern that can be used to create web applications that can be invoked over http and that give meaning to the use of getpostput and delete http operations with respect to specific resource (or type of datathe advantage of using restful services as technologycompared to some other approaches (such as soap based services which can also be invoked over httpis that the implementations tend to be simplerthe maintenance easierthey run over standard http and https protocols and do not require expensive infrastructures and licenses to use this means that there is lower server and server side costs there is little vendor or technology dependency and clients do not need to know anything about the implementation details or technologies being used to create the services restful api restful api is one in which you must first determine the key concepts or resources being represented or managed these might be booksproducts in shoproom bookings in hotels etc for example bookstore related service might provide information on resources such as bookscdsdvdsetc within this service books are just one type of resource we will ignore the other resources such as dvds and cds etc based on the idea of book as resource we will identify suitable urls for these restful services note that although urls are frequently used to describe web page--that is just one type of resource for examplewe might develop resource such as /bookservice/book from this we could develop url based apisuch as /bookservice/bookwhere isbn (the international standard book numberindicates unique number to be used to identify specific book whose details will be returned using this url |
19,478 | restful api we also need to design the representation or formats that the service can supply these could include plain textjsonxml etc json standards for the javascript object notation and is concise way to describe data that is to be transferred from service running on server to client running in browser this is the format we will use in the next section as part of this we might identify series of operations to be provided by our services based on the type of http method used to invoke our service and the contents of the url provided for examplefor simple bookservice this might beget /book/--used to retrieve book for given isbn get /book/list--used to retrieve all current books in json format post /book (json in body of the message)--which supports creating new book put /book (json in body of message)--used to update the data held on an existing book delete /book/--used to indicate that we would like specific book deleted from the list of books held note that the parameter isbn in the above urls actually forms part of the url path python web frameworks there are very many frameworks and libraries available in python that will allow you to create json based web servicesand the shear number of options available to you can be overwhelming for exampleyou might consider flaskdjangoweb py and cherrypy to name just few these frameworks and libraries offer different sets of facilities and levels of sophistication for example django is full-stack web frameworkthat is it is aimed at developing not just web services but full blown web sites howeverfor our purposes this is probably overkill and the django rest interface is only part of much larger infrastructure that does not mean of course that we could not use django to create our bookshop serviceshowever there are simpler options available the web py is another full stack web framework which we will also discount for the same reason in contrast flask and cherrypy are considered non full-stack frameworks (although you can create full stack web application using themthis means that they are lighter weight and quicker to get started with cherrypy was original rather more focussed on providing remote function call facility that allowed functions to |
19,479 | be invoked over HTTP; however this has been extended to provide more REST like facilities. In this chapter we will focus on Flask, as it is one of the most widely used frameworks for lightweight RESTful services in Python.
Flask
Flask is a web development framework for Python. It describes itself as a micro framework for Python, which is somewhat confusing, to the point where there is a page dedicated to this on their web site that explains what it means and what the implications are of this for Flask. According to Flask, the micro in its description relates to its primary aim of keeping the core of Flask simple but extensible. Unlike Django, it doesn't include facilities aimed at helping you integrate your application with a database, for example. Instead Flask focuses on the core functionality required of a web service framework and allows extensions to be used, as and when required, for additional functionality. Flask is also a convention over configuration framework; that is, if you follow the standard conventions then you will not need to deal with much additional configuration information (although if you wish to follow a different set of conventions then you can provide configuration information to change the defaults). As most people will (at least initially) follow these conventions, it makes it very easy to get something up and running very quickly.
Hello World in Flask
As is traditional in all programming languages, we will start off with a simple 'hello world' style application. This application will allow us to create a very simple web service that maps a particular URL to a function that will return JSON format data. We will use the JSON data format as it is very widely used within web-based services.
Using JSON
JSON stands for JavaScript Object Notation; it is a lightweight data-interchange format that is also easy for humans to read and write. Although it is derived from a subset of the JavaScript programming language, it is in fact completely language independent, and many languages and frameworks now support automatic processing of their own formats into and from JSON. This makes it ideal for RESTful web services. |
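As a quick illustration of that interchange, the following minimal sketch (not taken from the original text) uses Python's standard json module; the book dictionary and its ISBN and price values are invented purely for illustration.
import json

# A hypothetical book represented as a Python dictionary
# (the ISBN and price values here are made up for illustration).
book = {'isbn': 1, 'title': 'Python', 'author': 'Jasmine Byrne', 'price': 17.99}

# dumps() serialises the dictionary into a JSON string ...
text = json.dumps(book)
print(text)

# ... and loads() parses JSON text back into ordinary Python data structures.
data = json.loads(text)
print(data['title'])   # Python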
19,480 | JSON is actually built on some basic structures:
- a collection of name/value pairs, in which the name and value are separated by a colon ':' and each pair can be separated by a comma ','
- an ordered list of values that is encompassed in square brackets ('[]')
This makes it very easy to build up structures that represent any set of data. For example, a book with an ISBN, a title, an author and a price could be represented by:
{"author": "phoebe cooke", "isbn": …, "price": …, "title": "java"}
In turn a list of books can be represented by a comma separated set of books within square brackets, for example:
[{"author": "gryff smith", "isbn": …, "price": …, "title": "xml"},
 {"author": "phoebe cooke", "isbn": …, "price": …, "title": "java"},
 {"author": "jason procter", "isbn": …, "price": …, "title": "c#"}]
Implementing a Flask Web Service
There are several steps involved in creating a Flask web service; these are:
1. import Flask
2. initialise the Flask application
3. implement one or more functions (or methods) to support the services you wish to publish
4. provide routing information to route from the URL to a function (or method)
5. start the web service running
We will look at these steps in the rest of this chapter.
A Simple Service
We will now create our hello world web service. To do this we must first import the flask module; in this example we will use the Flask class and the jsonify() function elements of the module. |
19,481 | We then need to create the main application object, which is an instance of the Flask class:
from flask import Flask, jsonify
app = Flask(__name__)
The argument passed into the Flask() constructor is the name of the application's module or package. As this is a simple example we will use the __name__ attribute of the module, which in this case will be '__main__'. In larger, more complex applications with multiple packages and modules, you may need to choose an appropriate package name. The Flask application object implements the WSGI (Web Server Gateway Interface) standard for Python. This was originally specified in PEP-333 in 2003 and was updated for Python 3 in PEP-3333, published in 2010. It provides a simple convention for how web servers should handle requests to applications. The Flask application object is the element that can route a request for a URL to a Python function.
Providing Routing Information
We can now define routing information for the Flask application object. This information will map a URL to a function; when that URL is, for example, entered into a web browser's URL field, the Flask application object will receive that request and invoke the appropriate function. To provide route mapping information we use the @app.route decorator on a function or method. For example, in the following code the @app.route decorator maps the URL /hello to the function welcome() for HTTP GET requests:
@app.route('/hello', methods=['GET'])
def welcome():
    return jsonify({'msg': 'hello flask world'})
There are two things to note about this function definition. The first is that the @app.route decorator is used to declaratively specify the routing information for the function; this means that the URL '/hello' will be mapped to the function welcome(). The decorator also specifies the HTTP method that is supported; in this case GET requests are supported (which is actually the default, so it does not need to be included here, but is useful from a documentation point of view). |
19,482 | The second thing is that we are going to return our data using the JSON format; we therefore use the jsonify() function and pass it a Python dictionary structure with a single key/value pair. In this case the key is 'msg' and the data associated with that key is 'hello flask world'. The jsonify() function will convert this Python data structure into an equivalent JSON structure.
Running the Service
We are now ready to run our application. To do this we invoke the run() method of the Flask application object:
app.run(debug=True)
Optionally this method has a keyword parameter debug that can be set to True. If this is done then when the application is run some debugging information is generated that allows you to see what is happening. This can be useful in development but would not typically be used in production. The whole program is presented below:
from flask import Flask, jsonify

app = Flask(__name__)

@app.route('/hello', methods=['GET'])
def welcome():
    return jsonify({'msg': 'hello flask world'})

app.run(debug=True)
When this program is run the initial output generated is as shown below:
 * Serving Flask app "hello_flask_world" (lazy loading)
 * Environment: production
   WARNING: This is a development server. Do not use it in a production deployment.
   Use a production WSGI server instead.
 * Debug mode: on
 * Running on http://127.0.0.1:5000/
 * Restarting with stat
 * Debugger is active!
 * Debugger PIN: …
Of course we don't see any output from our own program yet; this is because we have not invoked the welcome() function via the /hello URL. |
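One way to trigger that first request is from a small Python client. The following is a sketch, not part of the original text; it uses only the standard library and assumes the service is listening on Flask's default development address of http://127.0.0.1:5000.
import json
import urllib.request

# Send an HTTP GET request to the /hello route exposed by the running Flask application.
with urllib.request.urlopen('http://127.0.0.1:5000/hello') as response:
    body = response.read().decode('utf-8')
    print(json.loads(body))   # expected: {'msg': 'hello flask world'}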
19,483 | Invoking the Service
We will use a web browser to access the web service. To do this we must enter the full URL that will route the request to our running application and to the welcome() function. The URL is actually comprised of two elements. The first part is the machine on which the application is running and the port that it is using to listen for requests; this is actually listed in the above output -- look at the line starting 'Running on'. This means that the URL must start with http://127.0.0.1:5000; this indicates that the application is running on the computer with the IP address 127.0.0.1 and listening on port 5000 (we could of course also use localhost instead of 127.0.0.1). The remainder of the URL must then provide the information that will allow Flask to route from the computer and port to the function we want to run. Thus the full URL is http://127.0.0.1:5000/hello. Entering this into the web browser gives the result shown below (browser screenshot not reproduced here): the result returned is the text we supplied to the jsonify() function, but now in plain JSON format and displayed within the web browser. You should also be able to see in the console output that a request was received by the Flask framework for the GET request mapped to the /hello URL:
[../May/.... ..:..:..] "GET /hello HTTP/1.1" 200 -
One useful feature of this approach is that if you make a change to your program then the Flask framework will notice this change when running in development mode and can restart the web service with the code changes deployed. If you do this you will see that the output notifies you of the change:
 * Detected change in 'hello_flask_world.py', reloading
 * Restarting with stat
This allows changes to be made on the fly and their effect can be immediately seen. |
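It is also possible to exercise the routes without starting a browser or even a real server, using Flask's built-in test client. This is a small sketch, not from the original text, and assumes the app object defined above is in scope.
# Flask's test client calls the routes in-process, which is convenient
# for quick checks and for automated tests.
with app.test_client() as client:
    response = client.get('/hello')
    print(response.status_code)   # expected: 200
    print(response.get_json())    # expected: {'msg': 'hello flask world'}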
19,484 | The Final Solution
We can tidy this example up a little by defining a function that can be used to create the Flask application object and by ensuring that we only run the application if the code is being run as the main module:
from flask import Flask, jsonify, url_for

def create_service():
    app = Flask(__name__)

    @app.route('/hello', methods=['GET'])
    def welcome():
        return jsonify({'msg': 'hello flask world'})

    with app.test_request_context():
        print(url_for('welcome'))

    return app

if __name__ == '__main__':
    app = create_service()
    app.run(debug=True)
One feature we have added to this program is the use of test_request_context(). The test request context object returned implements the context manager protocol and thus can be used via a with statement. This is useful for debugging purposes; it can be used to verify the URL used for any functions with routing information specified. In this case the output from the print statement is '/hello', as this is the URL defined by the @app.route decorator.
Online Resources
See the following online resources for information on the topics in this chapter:
- the PhD thesis in which REST was defined; if you are interested in the background to REST, read this
- web frameworks for Python |
19,485 | - the Web Server Gateway Interface (WSGI) standard
- HTTP status codes |
19,486 | Bookshop Web Service
Building a Flask Bookshop Service
The previous chapter illustrated the basic structure of a very simple web service application. We are now in a position to explore the creation of a set of web services for something a little more realistic: the bookshop web service application. In this chapter we will implement the set of web services described in the previous chapter for a very simple bookshop. This means that we will define services to handle not just GET requests but also PUT, POST and DELETE requests for the RESTful bookshop API.
The Design
Before we look at the implementation of the bookshop RESTful API, we will consider what elements we need for the services.
The Services
One question that often causes some confusion is how web services relate to traditional design approaches such as object oriented design. The approach adopted here is that the web service API provides a way to implement an interface to the appropriate functions, objects and methods used to implement the application's domain model. This means that we will still have a set of classes that represent the bookshop and the books held within the bookshop. In turn the functions implementing the web services will access the bookshop to retrieve, modify, update and delete the books held by the bookshop. |
19,487 | The overall design is shown below (the class diagram itself is not reproduced here). It shows that a Book object will have an isbn, a title, an author and a price attribute. In turn the Bookshop object will have a books attribute that will hold zero or more books; the books attribute will actually hold a list, as the list of books needs to change dynamically as and when new books are added or old books deleted. The Bookshop will also define three methods that will allow a book to be obtained via its ISBN, allow a book to be added to the list of books and enable a book to be deleted (based on its ISBN). Routing information will be provided for a set of functions that will invoke appropriate methods on the Bookshop object. The functions to be decorated with @app.route, and the mappings to be used, are listed below:
- get_books(), which maps to the /book/list URL using the HTTP GET method request
- get_book(isbn), which maps to the /book/<isbn> URL, where isbn is a URL parameter that will be passed into the function; this will also use the HTTP GET request
- create_book(), which maps to the /book URL using the HTTP POST request
- update_book(), which maps to the /book URL but using the HTTP PUT request
- delete_book(isbn), which maps to the /book/<isbn> URL but using the HTTP DELETE request
The Domain Model
The domain model comprises the Book and Bookshop classes; these are presented below. |
19,488 | The Book class is a simple value type class (that is, it is data oriented with no behaviour of its own):
class Book:
    def __init__(self, isbn, title, author, price):
        self.isbn = isbn
        self.title = title
        self.author = author
        self.price = price

    def __str__(self):
        return self.title + ' by ' + self.author + ' ' + str(self.price)
The Bookshop class holds a list of books and provides a set of methods to access books, update books and delete books:
class Bookshop:
    def __init__(self, books):
        self.books = books

    def get(self, isbn):
        if int(isbn) > len(self.books):
            abort(404)
        return list(filter(lambda b: b.isbn == isbn, self.books))[0]

    def add_book(self, book):
        self.books.append(book)

    def delete_book(self, isbn):
        self.books = list(filter(lambda b: b.isbn != isbn, self.books))
In the above code, the books attribute holds the list of books currently available. The get() method returns a book given a specified ISBN. The add_book() method adds a book object to the list of books. The delete_book() method removes a book based on its ISBN. The bookshop global variable holds the Bookshop object initialised with a default set of books. |
19,489 | bookshop = Bookshop([Book(..., 'XML', 'Gryff Smith', ...),
                     Book(..., 'Java', 'Phoebe Cooke', ...),
                     Book(..., 'Scala', 'Adam Davies', ...),
                     Book(..., 'Python', 'Jasmine Byrne', ...)])
Encoding Books into JSON
One issue we have is that although the jsonify() function knows how to convert built in types such as strings, integers, lists, dictionaries, etc. into an appropriate JSON format, it does not know how to do this for custom types such as Book. We therefore need to define some way of converting a book into an appropriate JSON format. One way we could do this would be to define a method that can be called to convert an instance of the Book class into the JSON format; we could call this method to_json(), for example:
class Book:
    """Represents a book in the bookshop"""

    def __init__(self, isbn, title, author, price):
        self.isbn = isbn
        self.title = title
        self.author = author
        self.price = price

    def __str__(self):
        return self.title + ' by ' + self.author + ' ' + str(self.price)

    def to_json(self):
        return {'isbn': self.isbn,
                'title': self.title,
                'author': self.author,
                'price': self.price}
We could now use this with the jsonify() function to convert a book into the JSON format:
jsonify({'book': book.to_json()}) |
19,490 | This approach certainly works and provides a very lightweight way to convert a book into JSON. However, it does mean that every time we want to jsonify a book we must remember to call the to_json() method. In some cases this means that we will also have to write some slightly convoluted code; for example, if we wish to return a list of books from the bookshop as a JSON list we might write:
jsonify({'books': [b.to_json() for b in bookshop.books]})
Here we have used a list comprehension to generate a list containing the JSON versions of the books held in the bookshop. This is starting to look overly complex, easy to forget about and probably error prone. Flask itself uses encoders to encode types into JSON, and it provides a way of creating your own encoders that can be used to convert a custom type, such as the Book class, into JSON. Such an encoder can automatically be used by the jsonify() function. To do this we must implement an encoder class; the class will extend the flask.json.JSONEncoder superclass. The class must define a method default(self, obj); this method takes an object and returns the JSON representation of that object. We can therefore write an encoder for the Book class as follows:
class BookJSONEncoder(JSONEncoder):
    def default(self, obj):
        if isinstance(obj, Book):
            return {'isbn': obj.isbn,
                    'title': obj.title,
                    'author': obj.author,
                    'price': obj.price}
        else:
            return super(BookJSONEncoder, self).default(obj)
The default() method in this class checks that the object passed to it is an instance of the class Book, and if it is then it will create a JSON version of the book; this JSON structure is based on the isbn, title, author and price attributes. If it is not an instance of the Book class, then it passes the object up to the parent class. We can now register this encoder with the Flask application object so that it will be used whenever a book must be converted into JSON. This is done by assigning the custom encoder to the Flask application object via the app.json_encoder attribute. |
19,491 | app = Flask(__name__)
app.json_encoder = BookJSONEncoder
Now if we wish to encode a single book or a list of books, the above encoder will be used automatically and thus we do not need to do anything else. Our earlier examples can therefore be written simply by referencing the book or the bookshop.books attribute:
jsonify({'book': book})
jsonify({'books': bookshop.books})
Setting up the GET Services
We can now set up the two services that will support GET requests; these are the /book/list and /book/<isbn> services. The functions that these URLs map to are given below:
@app.route('/book/list', methods=['GET'])
def get_books():
    return jsonify({'books': bookshop.books})

@app.route('/book/<int:isbn>', methods=['GET'])
def get_book(isbn):
    book = bookshop.get(isbn)
    return jsonify({'book': book})
The first function merely returns the current list of books held by the bookshop in a JSON structure, using the key books. The second function takes an ISBN number as a parameter. This is a URL parameter; in other words, part of the URL used to invoke this function is actually dynamic and will be passed into the function. This means that a user can request details of books with different ISBNs just by changing the ISBN element of the URL; for example, /book/ followed by an ISBN indicates that we want information on the book with that ISBN, and changing the ISBN in the URL requests a different book. In Flask, to indicate that something is a URL parameter rather than a hard coded element of the URL, we use angle brackets ('<' and '>'); these surround the URL parameter name and allow the parameter to be passed into the function (using the same name). |
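These two GET services can also be exercised from a small Python client rather than a browser. The following sketch is not part of the original text; it assumes the third-party requests library is installed, that the service is running on Flask's default http://127.0.0.1:5000, and the ISBN used is made up (substitute one that exists in your own data).
import requests

BASE = 'http://127.0.0.1:5000'

# Fetch every book currently held by the bookshop ...
print(requests.get(BASE + '/book/list').json())

# ... and then a single book by its ISBN (the value 1 here is hypothetical).
print(requests.get(BASE + '/book/1').json())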
19,492 | In the above example we have also (optionally) indicated the type of the parameter. By default the type will be string; however, we know that the ISBN is in fact an integer and so we have indicated that by prefixing the parameter name with the type int (and separating the type information from the parameter name by a colon ':'). There are actually several options available, including string (the default), int (as used above), float for positive floating point values, uuid for UUID strings and path, which is like string but also accepts slashes. We can again use a browser to view the results of calling these services; this time the URLs will be of the form http://127.0.0.1:5000/book/ followed by the ISBN of the required book (the browser output is not reproduced here). As you can see from this, the book information is returned as a set of key/value pairs in JSON format. |
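As a purely illustrative sketch (these routes are not part of the bookshop service and the endpoint names are invented), the other converter types are used in exactly the same way, assuming the app object and jsonify import shown earlier:
@app.route('/book/title/<string:title>')     # string is the default converter type
def get_book_by_title(title):
    return jsonify({'title': title})

@app.route('/files/<path:filename>')         # path is like string but also accepts '/'
def get_file(filename):
    return jsonify({'filename': filename})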
19,493 | Deleting a Book
The delete book web service is very similar to the get book service in that it takes an ISBN as a URL path parameter; however, in this case it merely returns an acknowledgement that the book was deleted successfully:
@app.route('/book/<int:isbn>', methods=['DELETE'])
def delete_book(isbn):
    bookshop.delete_book(isbn)
    return jsonify({'result': True})
However, we can no longer test this just by using a web browser. This is because the web browser uses the HTTP GET request method for all URLs entered into the URL field, whereas the delete web service is associated with the HTTP DELETE request method. To invoke the delete_book() function we therefore need to ensure that the request that is sent uses the DELETE request method. This can be done from any client that can indicate the type of request method being used; examples might include another Python program, a JavaScript web site, etc. For testing purposes we will however use the curl program. This program is available on most Linux and Mac systems and can be easily installed, if it is not already available, on other operating systems. curl is a command line tool and library that can be used to send and receive data over the internet. It supports a wide range of protocols and standards; in particular it supports the HTTP and HTTPS protocols and can be used to send and receive data over HTTP using different request methods. For example, to invoke the delete_book() function using the /book/<isbn> URL and the HTTP DELETE method we can use curl as follows:
curl -X DELETE http://127.0.0.1:5000/book/<isbn>
This indicates that we want to invoke the given URL and that we wish to use a custom request method (i.e. not the default GET), which is in this case DELETE (as indicated by the -X option). The result returned by the command is given below, indicating that the book was successfully deleted:
{
  "result": true
}
We can verify this by checking the output from the /book/list URL in the web browser. |
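The same delete-and-verify cycle can be scripted in Python; a small sketch (not from the original text) using the third-party requests library, with a made-up ISBN standing in for a real one:
import requests

BASE = 'http://127.0.0.1:5000'
isbn = 2   # hypothetical ISBN of a book currently held by the bookshop

# Issue the DELETE request and print the acknowledgement ...
print(requests.delete(BASE + '/book/' + str(isbn)).json())   # {'result': True}

# ... then re-fetch the list to confirm the book has gone.
print(requests.get(BASE + '/book/list').json())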
19,494 | This confirms that the book has been deleted.
Adding a New Book
We also want to support adding a new book to the bookshop. The details of a new book could just be added to the URL as URL path parameters; however, as the amount of data to be added grows this would become increasingly difficult to maintain and verify. Indeed, although historically there was a limit of 2,083 characters in Microsoft's Internet Explorer (IE), which has theoretically been removed in later versions, in practice there are typically still limits on the size of a URL. Most web servers have a limit of around 8 kb, although this is typically configurable, and there may also be client side limits (such as those imposed by IE or Apple's Safari). If the limit is exceeded in either a browser or on the server, then most systems will just truncate the characters outside the limit (in some cases without any warning). Typically such data is therefore sent in the body of the HTTP request as part of an HTTP POST request. The limit on the size of a POST request's message body is much higher (usually up to 2 GB), which means that it is a much more reliable and safer way to transfer data to a web service. However, it should be noted that this does not mean that the data is any more secure than if it is part of the URL, just that it is sent in a different way. From the point of view of the Python functions that are invoked as the result of an HTTP POST method request, it means that the data is not available as a parameter to |
19,495 | the URL and thus to the function. Instead, within the function it is necessary to obtain the request object and then to use that to obtain the information held within the body of the request. A key attribute on the request object, available when an HTTP request contains JSON data, is the request.json attribute. This attribute contains a dictionary like structure holding the values associated with the keys in the JSON data structure. This is shown below for the create_book() function:
from flask import request, abort

@app.route('/book', methods=['POST'])
def create_book():
    print('create book')
    if not request.json or not 'isbn' in request.json:
        abort(400)
    book = Book(request.json['isbn'],
                request.json['title'],
                request.json.get('author', ""),
                float(request.json['price']))
    bookshop.add_book(book)
    return jsonify({'book': book}), 201
The above function accesses the Flask request object that represents the current HTTP request. The function first checks that it contains JSON data and that the ISBN of the book to add is part of that JSON structure. If the ISBN is not present then the Flask abort() function is called, passing in a suitable HTTP response status code; in this case the error code indicates that this was a bad request (HTTP error code 400). If however the JSON data is present and does contain an ISBN number, then the values for the keys isbn, title, author and price are obtained. Remember that JSON is a dictionary like structure of keys and values; treating it in this way makes it easy to extract the data that a JSON structure holds. It also means that we can use both method and key oriented access styles; this is shown above where we use the get() method along with a default value to use if an author is not specified. Finally, as we want to treat the price as a floating point number, we must use the float() function to convert the string format supplied by JSON into a float. Using the data extracted we can instantiate a new Book instance that can be added to the bookshop. As is common in web services, we return the newly created book object as the result of creating the book, along with the HTTP response status code 201, which indicates the successful creation of a resource. |
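The same POST request can be issued from Python. The following is a minimal sketch, not from the original text, using the third-party requests library; when the json keyword argument is used, requests sets the Content-Type header and encodes the body automatically. The book values below are invented for illustration.
import requests

new_book = {'isbn': 45, 'title': 'Read Python', 'author': 'Bob', 'price': '12.99'}  # made-up values

# POST the JSON body to the /book service.
response = requests.post('http://127.0.0.1:5000/book', json=new_book)

print(response.status_code)   # expected: 201
print(response.json())        # the newly created book echoed back as JSON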
19,496 | We can now test this service using the curl command line program:
curl -H "Content-Type: application/json" -X POST -d '{"title":"read book", "author":"bob", "isbn":"…", "price":"…"}' http://127.0.0.1:5000/book
The options used with this command indicate the type of data being sent in the body of the request (-H), along with the data to include in the body of the request (-d). The result of running this command is:
{
  "book": {
    "author": "bob",
    "isbn": "…",
    "price": …,
    "title": "read book"
  }
}
illustrating that the new book by bob has been added.
Updating a Book
Updating a book that is already held by the Bookshop object is very similar to adding a book, except that the HTTP PUT request method is used. Again the function implementing the required behaviour must use the Flask request object to access the data submitted along with the PUT request; however, in this case the ISBN number specified is used to find the book to be updated, rather than specifying a completely new book. The update_book() function is given below:
@app.route('/book', methods=['PUT'])
def update_book():
    if not request.json or not 'isbn' in request.json:
        abort(400)
    isbn = request.json['isbn']
    book = bookshop.get(isbn)
    book.title = request.json['title']
    book.author = request.json['author']
    book.price = request.json['price']
    return jsonify({'book': book}) |
19,497 | This function resets the title, author and price of the book retrieved from the bookshop. It again returns the updated book as the result of running the function. The curl program can again be used to invoke this function, although this time the HTTP PUT method must be specified:
curl -H "Content-Type: application/json" -X PUT -d '{"title":"read python book", "author":"bob jones", "isbn":"…", "price":"…"}' http://127.0.0.1:5000/book
The output from this command is:
{
  "book": {
    "author": "bob jones",
    "isbn": "…",
    "price": "…",
    "title": "read python book"
  }
}
This shows that the book has been updated with the new information.
What Happens if We Get It Wrong?
The code presented for the bookshop web services is not particularly defensive, as it is possible to try to add a new book with the same ISBN as an existing one. It does, however, check that an ISBN number has been supplied with both the create_book() and update_book() functions. But what happens if an ISBN number is not supplied? In both functions we call the Flask abort() function; by default, if this happens an error message will be sent back to the client. For example, in the following command we have forgotten to include the ISBN number:
curl -H "Content-Type: application/json" -X POST -d '{"title":"read book", "author":"tom andrews", "price":"…"}' http://127.0.0.1:5000/book |
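By default, abort() results in Flask returning an HTML error page to the client. For a JSON API it is often preferable to return a JSON error body instead; the following is a small sketch, not taken from the original text, that registers error handlers on the app object defined above using Flask's errorhandler decorator.
from flask import jsonify

@app.errorhandler(400)
def bad_request(error):
    # Return a JSON payload (plus the 400 status code) instead of Flask's default HTML error page.
    return jsonify({'error': 'bad request', 'message': str(error)}), 400

@app.errorhandler(404)
def not_found(error):
    # Triggered, for example, when Bookshop.get() aborts because no matching book exists.
    return jsonify({'error': 'not found'}), 404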
19,498 | Example Program: Batch Usernames; Summary; Exercises; Defining Functions; The Function of Functions; Functions, Informally; Future Value with a Function; Functions and Parameters: The Exciting Details; Getting Results from a Function; Functions That Return Values; Functions That Modify Parameters; Functions and Program Structure; Summary; Exercises; Decision Structures; Simple Decisions; Example: Temperature Warnings; Forming Simple Conditions; Example: Conditional Program Execution; Two-Way Decisions; Multi-Way Decisions; Exception Handling; Study in Design: Max of Three; Strategy: Compare Each to All; Strategy: Decision Tree; Strategy: Sequential Processing; Strategy: Use Python; Some Lessons; Summary; Exercises; Loop Structures and Booleans; For Loops: A Quick Review; Indefinite Loops; Common Loop Patterns; Interactive Loops; Sentinel Loops; File Loops; Nested Loops; Computing with Booleans; Boolean Operators |
19,499 | Boolean Algebra; Other Common Structures; Post-Test Loop; Loop and a Half; Boolean Expressions as Decisions; Summary; Exercises; Simulation and Design; Simulating Racquetball; A Simulation Problem; Analysis and Specification; Pseudo Random Numbers; Top-Down Design; Top-Level Design; Separation of Concerns; Second-Level Design; Designing SimNGames; Third-Level Design; Finishing Up; Summary of the Design Process; Bottom-Up Implementation; Unit Testing; Simulation Results; Other Design Techniques; Prototyping and Spiral Development; The Art of Design; Summary; Exercises; Defining Classes; Quick Review of Objects; Example Program: Cannonball; Program Specification; Designing the Program; Modularizing the Program; Defining New Classes; Example: Multi-Sided Dice; Example: The Projectile Class; Data Processing with Class; Objects and Encapsulation; Encapsulating Useful Abstractions; Putting Classes in Modules |
Subsets and Splits