========================
Amazon S3 Stream Wrapper
========================

Introduction
------------

The Amazon S3 stream wrapper allows you to store and retrieve data from Amazon
S3 using built-in PHP functions like ``file_get_contents``, ``fopen``,
``copy``, ``rename``, ``unlink``, ``mkdir``, ``rmdir``, etc.

You need to register the Amazon S3 stream wrapper in order to use it:

.. code-block:: php

    $client = new Aws\S3\S3Client([/** options **/]);

    // Register the stream wrapper from an S3Client object
    $client->registerStreamWrapper();

This allows you to access buckets and objects stored in Amazon S3 using the
``s3://`` protocol. The "s3" stream wrapper accepts strings that contain a
bucket name followed by a forward slash and an optional object key or prefix:
``s3://<bucket>[/<key-or-prefix>]``.

.. note::

    The stream wrapper is designed for working with objects and buckets on
    which you have at least read permission. This means that your user should
    have permission to execute ``ListBucket`` on any buckets and ``GetObject``
    on any object with which you need to interact. For use cases where you do
    not have this permission level, it is recommended that you use S3 client
    operations directly.

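If you only have ``GetObject`` permission on specific keys, for example, you
can call the operation directly instead of going through the wrapper. A
minimal sketch (the bucket and key names are placeholders):

.. code-block:: php

    $client = new Aws\S3\S3Client([/** options **/]);

    // Fetch a single object without registering the stream wrapper;
    // this requires only GetObject permission on this one key
    $result = $client->getObject([
        'Bucket' => 'bucket',
        'Key'    => 'key',
    ]);

    echo $result['Body'];
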
Downloading data
----------------

You can grab the contents of an object using ``file_get_contents``. Be careful
with this function though; it loads the entire contents of the object into
memory.

.. code-block:: php

    // Download the body of the "key" object in the "bucket" bucket
    $data = file_get_contents('s3://bucket/key');

Use ``fopen()`` when working with larger files or if you need to stream data
from Amazon S3.

.. code-block:: php

    // Open a stream in read-only mode
    if ($stream = fopen('s3://bucket/key', 'r')) {
        // While the stream is still open
        while (!feof($stream)) {
            // Read 1024 bytes from the stream
            echo fread($stream, 1024);
        }
        // Be sure to close the stream resource when you're done with it
        fclose($stream);
    }

Opening Seekable streams
~~~~~~~~~~~~~~~~~~~~~~~~

Streams opened in "r" mode only allow data to be read from the stream, and are
not seekable by default. This is so that data can be downloaded from Amazon S3
in a truly streaming manner, where previously read bytes do not need to be
buffered into memory. If you need a stream to be seekable, you can pass
``seekable`` into the `stream context options <http://www.php.net/manual/en/function.stream-context-create.php>`_
of a function.

.. code-block:: php

    $context = stream_context_create([
        's3' => ['seekable' => true]
    ]);

    if ($stream = fopen('s3://bucket/key', 'r', false, $context)) {
        // Read bytes from the stream
        fread($stream, 1024);
        // Seek back to the beginning of the stream
        fseek($stream, 0);
        // Read the same bytes that were previously read
        fread($stream, 1024);
        fclose($stream);
    }

Opening seekable streams allows you to seek only to bytes that were previously
read. You cannot skip ahead to bytes that have not yet been read from the
remote server. In order to allow previously read data to be recalled, data is
buffered in a PHP temp stream using a stream decorator. When the amount of
cached data exceeds 2MB, the data in the temp stream will transfer from memory
to disk. Keep this in mind when downloading large files from Amazon S3 using
the ``seekable`` stream context setting.

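The memory-to-disk transfer is handled by PHP's ``php://temp`` stream, which
the buffering decorator builds on. A minimal sketch of that underlying
mechanism (the 2MB threshold shown here mirrors the behavior described above;
the decorator itself is internal to the SDK):

.. code-block:: php

    // php://temp keeps data in memory until the maxmemory threshold is
    // crossed, then transparently spills the buffer to a temporary file
    $buffer = fopen('php://temp/maxmemory:' . (2 * 1024 * 1024), 'r+');

    fwrite($buffer, str_repeat('a', 3 * 1024 * 1024)); // past 2MB: now on disk
    rewind($buffer);                                    // still fully seekable
    fclose($buffer);
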
Uploading data
--------------

Data can be uploaded to Amazon S3 using ``file_put_contents()``.

.. code-block:: php

    file_put_contents('s3://bucket/key', 'Hello!');

You can upload larger files by streaming data using ``fopen()`` and a "w", "x",
or "a" stream access mode. The Amazon S3 stream wrapper does **not** support
simultaneous read and write streams (e.g. "r+", "w+", etc.). This is because
the HTTP protocol does not allow simultaneous reading and writing.

.. code-block:: php

    $stream = fopen('s3://bucket/key', 'w');
    fwrite($stream, 'Hello!');
    fclose($stream);

.. note::

    Because Amazon S3 requires a Content-Length header to be specified before
    the payload of a request is sent, the data to be uploaded in a PutObject
    operation is internally buffered using a PHP temp stream until the stream
    is flushed or closed.

fopen modes
-----------

PHP's `fopen() <http://php.net/manual/en/function.fopen.php>`_ function
requires that a ``$mode`` option is specified. The mode option specifies
whether or not data can be read or written to a stream and if the file must
exist when opening a stream. The Amazon S3 stream wrapper supports the
following modes:

= =============================================================================
r A read only stream where the file must already exist.
w A write only stream. If the file already exists, it will be overwritten.
a A write only stream. If the file already exists, it will be downloaded to a
  temporary stream, and any writes to the stream will be appended to any
  previously uploaded data.
x A write only stream. An error is raised if the file already exists.
= =============================================================================

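For example, the "a" mode lets you append to an object that already exists, at
the cost of first downloading it into the temporary stream. A minimal sketch,
assuming ``s3://bucket/key`` already holds data:

.. code-block:: php

    // Open an append stream; the existing contents of "key" are
    // downloaded into a temp stream before any writes are applied
    $stream = fopen('s3://bucket/key', 'a');

    // Appended to the previously uploaded data; the full object is
    // re-uploaded when the stream is flushed or closed
    fwrite($stream, "appended line\n");
    fclose($stream);
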
Other object functions
----------------------

Stream wrappers allow many different built-in PHP functions to work with a
custom system like Amazon S3. Here are some of the built-in functions that the
Amazon S3 stream wrapper supports for objects stored in Amazon S3.

=============== ================================================================
unlink()        Delete an object from a bucket.

                .. code-block:: php

                    // Delete an object from a bucket
                    unlink('s3://bucket/key');

                You can pass in any options available to the ``DeleteObject``
                operation to modify how the object is deleted (e.g. specifying
                a specific object version).

                .. code-block:: php

                    // Delete a specific version of an object from a bucket
                    unlink('s3://bucket/key', stream_context_create([
                        's3' => ['VersionId' => '123']
                    ]));

filesize()      Get the size of an object.

                .. code-block:: php

                    // Get the Content-Length of an object
                    $size = filesize('s3://bucket/key');

is_file()       Checks if a URL is a file.

                .. code-block:: php

                    if (is_file('s3://bucket/key')) {
                        echo 'It is a file!';
                    }

file_exists()   Checks if an object exists.

                .. code-block:: php

                    if (file_exists('s3://bucket/key')) {
                        echo 'It exists!';
                    }

filetype()      Checks if a URL maps to a file or bucket (dir).

file()          Load the contents of an object into an array of lines. You can
                pass in any options available to the ``GetObject`` operation to
                modify how the file is downloaded.

filemtime()     Get the last modified date of an object.

rename()        Rename an object by copying the object then deleting the
                original. You can pass in options available to the
                ``CopyObject`` and ``DeleteObject`` operations to the stream
                context parameters to modify how the object is copied and
                deleted (see the example following this table).
=============== ================================================================

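As an example of the ``rename()`` row above, a rename is a ``CopyObject``
followed by a ``DeleteObject``, and a stream context can carry options for
both underlying operations. A minimal sketch (``MetadataDirective`` is one
illustrative ``CopyObject`` option; bucket and key names are placeholders):

.. code-block:: php

    // Rename an object (copy + delete under the hood)
    rename('s3://bucket/key', 's3://other-bucket/new-key');

    // Pass options through to the underlying CopyObject operation
    rename('s3://bucket/key', 's3://other-bucket/new-key', stream_context_create([
        's3' => ['MetadataDirective' => 'REPLACE']
    ]));
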
.. note::

    While ``copy`` will generally work with the S3 stream wrapper, some errors
    may not be properly reported due to the internals of the ``copy`` function
    in PHP. It is recommended that you use an instance of `Aws\S3\ObjectCopier
    <http://docs.aws.amazon.com/aws-sdk-php/v3/api/class-Aws.S3.ObjectCopier.html>`_
    instead.

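A minimal sketch of the ``ObjectCopier`` alternative; the constructor
arguments shown here (client, source array, destination array, ACL) follow
the linked API documentation, so verify them against the SDK version you use:

.. code-block:: php

    use Aws\S3\ObjectCopier;

    // Copies s3://bucket/key to s3://other-bucket/new-key and surfaces
    // failures as exceptions rather than suppressed copy() warnings
    $copier = new ObjectCopier(
        $client,
        ['Bucket' => 'bucket', 'Key' => 'key'],
        ['Bucket' => 'other-bucket', 'Key' => 'new-key'],
        'private'
    );
    $copier->copy();
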
Working with buckets
--------------------

You can modify and browse Amazon S3 buckets similar to how PHP allows the
modification and traversal of directories on your filesystem.

Here's an example of creating a bucket:

.. code-block:: php

    mkdir('s3://bucket');

You can pass in stream context options to the ``mkdir()`` function to modify
how the bucket is created using the parameters available to the `CreateBucket
<http://docs.aws.amazon.com/aws-sdk-php/latest/class-Aws.S3.S3Client.html#_createBucket>`_
operation. Note that the stream context is the fourth argument of ``mkdir()``,
after the mode and recursive flags.

.. code-block:: php

    // Create a bucket in the EU (eu-west-1) region
    mkdir('s3://bucket', 0500, false, stream_context_create([
        's3' => ['LocationConstraint' => 'eu-west-1']
    ]));

You can delete buckets using the ``rmdir()`` function.

.. code-block:: php

    // Delete a bucket
    rmdir('s3://bucket');

.. note::

    A bucket can only be deleted if it is empty.

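If the bucket still contains objects, you can empty it first. One way is the
SDK's ``Aws\S3\BatchDelete`` helper; a minimal sketch, assuming ``$client`` is
a configured ``S3Client``:

.. code-block:: php

    use Aws\S3\BatchDelete;

    // Delete every object in the bucket, then remove the bucket itself
    BatchDelete::fromListObjects($client, ['Bucket' => 'bucket'])->delete();
    rmdir('s3://bucket');
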
Listing the contents of a bucket
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

The `opendir() <http://www.php.net/manual/en/function.opendir.php>`_,
`readdir() <http://www.php.net/manual/en/function.readdir.php>`_,
`rewinddir() <http://www.php.net/manual/en/function.rewinddir.php>`_, and
`closedir() <http://php.net/manual/en/function.closedir.php>`_ PHP functions
can be used with the Amazon S3 stream wrapper to traverse the contents of a
bucket. You can pass in parameters available to the
`ListObjects <http://docs.aws.amazon.com/aws-sdk-php/latest/class-Aws.S3.S3Client.html#_listObjects>`_
operation as custom stream context options to the ``opendir()`` function to
modify how objects are listed.

.. code-block:: php

    $dir = "s3://bucket/";

    if (is_dir($dir) && ($dh = opendir($dir))) {
        while (($file = readdir($dh)) !== false) {
            echo "filename: {$file} : filetype: " . filetype($dir . $file) . "\n";
        }
        closedir($dh);
    }

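To pass ``ListObjects`` parameters, hand ``opendir()`` a stream context. A
minimal sketch using ``Delimiter`` (set to an empty string here, which is
assumed to return a flat listing of every key under the path rather than
grouping keys into pseudo-directories):

.. code-block:: php

    $context = stream_context_create([
        's3' => ['Delimiter' => '']
    ]);

    if ($dh = opendir('s3://bucket/', $context)) {
        while (($file = readdir($dh)) !== false) {
            echo $file . "\n";
        }
        closedir($dh);
    }
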
You can recursively list each object and prefix in a bucket using PHP's
`RecursiveDirectoryIterator <http://php.net/manual/en/class.recursivedirectoryiterator.php>`_.

.. code-block:: php

    $dir = 's3://bucket';
    $iterator = new RecursiveIteratorIterator(new RecursiveDirectoryIterator($dir));

    foreach ($iterator as $file) {
        echo $file->getType() . ': ' . $file . "\n";
    }

Another way to list the contents of a bucket recursively that incurs fewer
HTTP requests is to use the ``Aws\recursive_dir_iterator($path, $context = null)``
function.

.. code-block:: php

    <?php
    require 'vendor/autoload.php';

    $iter = Aws\recursive_dir_iterator('s3://bucket/key');
    foreach ($iter as $filename) {
        echo $filename . "\n";
    }

Stream context options
----------------------

You can customize the client used by the stream wrapper or the cache used to
cache previously loaded information about buckets and keys by passing in custom
stream context options.

The stream wrapper supports the following stream context options on every
operation:

``client``
    The ``Aws\AwsClientInterface`` object to use to execute commands.

``cache``
    An instance of ``Aws\CacheInterface`` to use to cache previously obtained
    file stats. The stream wrapper will use an in-memory LRU cache by default.

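A minimal sketch combining both options; ``Aws\LruArrayCache`` is the SDK's
bundled ``Aws\CacheInterface`` implementation, and the cache size argument
here is illustrative:

.. code-block:: php

    use Aws\LruArrayCache;
    use Aws\S3\S3Client;

    $client = new S3Client([/** options **/]);
    $client->registerStreamWrapper();

    // Use an explicit client and a larger stat cache for this call only
    $context = stream_context_create([
        's3' => [
            'client' => $client,
            'cache'  => new LruArrayCache(500),
        ]
    ]);

    $stream = fopen('s3://bucket/key', 'r', false, $context);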