GPU based algorithm on AWS Lambda
Currently, Lambda does not offer GPU support.
However, if you only need to run inference, falling back to the CPU works fine on AWS Lambda. This article goes into more detail:
https://aws.amazon.com/blogs/machine-learning/how-to-deploy-deep-learning-models-with-aws-lambda-and-tensorflow/
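The pattern in that article boils down to loading the model once per container (outside the handler) so warm invocations skip the load cost, then running CPU inference inside the handler. Here is a minimal sketch of that structure; `load_model` is a hypothetical stand-in for whatever your framework provides (e.g. TensorFlow's SavedModel loader), not a real API:

```python
import json

# Loaded once per Lambda container; warm invocations reuse it.
_MODEL = None

def load_model():
    # Hypothetical stub — replace with your framework's loader,
    # e.g. a TensorFlow SavedModel restored from the deployment package.
    return lambda xs: [x * 2.0 for x in xs]  # placeholder "inference"

def handler(event, context):
    global _MODEL
    if _MODEL is None:
        _MODEL = load_model()  # cold start pays the load cost once
    inputs = json.loads(event["body"])["inputs"]
    outputs = _MODEL(inputs)
    return {"statusCode": 200, "body": json.dumps({"outputs": outputs})}
```

The key point is that model loading happens at module scope (or lazily on first call), not on every request.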
You can't specify the runtime environment for AWS Lambda functions, so no, you can't require the presence of a GPU. In fact, the physical machines AWS puts into its Lambda pool will almost certainly not have one.
Your best bet would be to run the GPU-requiring function as an AWS Batch job on a compute environment configured to use p-type (GPU) instances. The guide here might be helpful.
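To make the Batch route concrete, here is a hedged sketch of what submitting such a job looks like with boto3. The queue and job-definition names are hypothetical; in a real setup the job definition would reference a container image and a compute environment backed by p-type instances:

```python
def build_submit_job_request(job_name, queue, definition, command):
    # Shape matches the keyword arguments of boto3's batch.submit_job.
    # Names below are illustrative placeholders, not real resources.
    return {
        "jobName": job_name,
        "jobQueue": queue,
        "jobDefinition": definition,
        "containerOverrides": {"command": command},
    }

request = build_submit_job_request(
    job_name="gpu-inference",        # hypothetical
    queue="gpu-queue",               # queue on a p-type compute environment
    definition="gpu-job-def:1",      # hypothetical job definition revision
    command=["python", "run.py"],
)

# Actual submission (requires AWS credentials and existing resources):
# import boto3
# boto3.client("batch").submit_job(**request)
```

A Lambda function can still act as the trigger: it receives the event, calls `submit_job`, and returns immediately, while the GPU work runs asynchronously on the Batch cluster.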