Create a New Project on the Google Cloud Platform
The first step to accessing Google's resources is to create a new project. Go to the Google Cloud Console at https://console.cloud.google.com/ and click New Project. Fill in the basic details on the form that appears, submit it, and the project is created.
After creating the project, navigate to it, e.g. https://console.cloud.google.com/welcome?project=project-spreadsheet-xxx. Since we plan to use the Google Sheets API, we first have to enable it. To do that, open Enabled APIs & Services and then, at the top, click ENABLE APIS AND SERVICES. Alternatively, go directly to https://console.cloud.google.com/apis/library?project=project-spreadsheet-xxx, search for the Google Sheets API, and enable it.
Once the API is enabled, we have to create credentials. In our case, we want a Service Account. Make sure you choose Owner as the role, then go ahead and create the service account.
After the successful creation of the service account, you will be redirected to the list of credentials.
On that page, click the Edit icon next to the service account; it takes you to another page from which you can download the credentials as a JSON file.
Just follow along. Also, don’t forget to copy the email id that should be listed below the Service Accounts section.
Use https://console.cloud.google.com/iam-admin/settings to delete an existing project, and https://console.cloud.google.com/cloud-resource-manager to see all of your projects.
Install the Google API Client Package
In order to interact with a Google Sheet from PHP, we first need to bring in the Google API Client package via Composer.
Let’s do that by running the following command:
composer require google/apiclient
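Once the package is installed, the client can be bootstrapped along these lines. This is a minimal sketch, assuming the service account key downloaded earlier is saved as creds.json next to the script (the file name is just an example; the full script later in this post uses the same setup):

<?php
// Composer autoloader for the Google API client
require __DIR__ . '/vendor/autoload.php';

use Google\Client;
use Google\Service\Sheets;

$client = new Client();
// Service account key file downloaded from the Cloud Console (example name)
$client->setAuthConfig('creds.json');
// Only spreadsheet access is needed for this post
$client->setScopes(['https://www.googleapis.com/auth/spreadsheets']);

$service = new Sheets($client);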
Don't forget to copy the email id listed under the Service Accounts section. We will need this email id, so go to your Google Cloud Console and, under Service Accounts, copy it.
Now let's create a new Google Sheet by going to docs.google.com and creating a new spreadsheet. Then click the Share button at the top right of the newly created sheet and paste the service account email id there, so the sheet is shared with the service account.
Below is the full PHP script to read data from the Google Sheet and then update it:
<?php

use Google\Client;
use Google\Service\Drive;
use Google\Service\Sheets;
use Google\Service\Sheets\ValueRange;

set_time_limit(60 * 60 * 12);
ini_set("memory_limit", "-1");
ini_set('display_errors', 1);
error_reporting(E_ALL);
ini_set('display_startup_errors', 1);

$client = new Client();
$client->setAuthConfig('creds.json');
$client->setScopes(['https://www.googleapis.com/auth/spreadsheets']);
$client->useApplicationDefaultCredentials();

$service = new Sheets($client);

$spreadsheetId = "1AN34n84blDoH_xxxxxxxxxx_faP6S5pjO5xaGdk7t77qPAgQY";

$ranges = 'Sheet1!A1:C20';
$params = array(
    'ranges' => $ranges,
    'majorDimension' => 'ROWS'
);

// execute the request
$result = $service->spreadsheets_values->batchGet($spreadsheetId, $params);
if (!$result) {
    return [
        'error' => true,
        'message' => 'missing row data'
    ];
}

$values = $result->getValueRanges();

echo "<pre>";
print_r($values);
echo "</pre>";

$updateRow = [];
$updateRow[] = ["Name1", "Id1", "Group1-".time()];

$valueRange = new ValueRange();
$valueRange->setValues($updateRow);

$range = 'Sheet1!A2:C'.(count($updateRow) + 1); // where the replacement will start, here, first column and second line
$options = ['valueInputOption' => 'USER_ENTERED'];

$service->spreadsheets_values->update($spreadsheetId, $range, $valueRange, $options);
The corresponding Google Sheet, after the update, looks as below:
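If you want to work with the returned rows directly instead of dumping the whole response with print_r, a loop like the following should do. This is a minimal sketch that reuses $service and $spreadsheetId from the script above and reads a single range with get() instead of batchGet():

<?php
// Sketch: read one range and walk its rows (assumes $service and $spreadsheetId from above)
$response = $service->spreadsheets_values->get($spreadsheetId, 'Sheet1!A1:C20');
$rows = $response->getValues() ?? [];

foreach ($rows as $index => $row) {
    // Each $row is a plain array of cell values for that spreadsheet row
    echo sprintf("Row %d: %s\n", $index + 1, implode(", ", $row));
}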
Thursday, July 4, 2024
How To Read and Update Data From Google Sheets Using PHP
Monday, June 24, 2024
Gram Chara Oi Ranga Matir Poth Lyrics
গ্রামছাড়া ওই রাঙা মাটির পথ
আমার মন ভুলায় রে।
ওরে কার পানে মন হাত বাড়িয়ে
লুটিয়ে যায় ধুলায় রে আমার
মন ভুলায় রে।
গ্রামছাড়া ওই রাঙা মাটির পথ
আমার মন ভুলায় রে।
ও যে আমায় ঘরের বাহির করে,
পায়ে-পায়ে ধরে
মরি হায় হায় রে ।
ও যে কেড়ে আমায় নিয়ে যায় রে
যায় রে কোন চুলায় রে আমার
মন ভুলায় রে।
ও কোন বাঁকে কী ধন দেখাবে,
কোনখানে কী দায় ঠেকাবে---
কোথায় গিয়ে শেষ মেলে যে
ভেবেই না কুলায় রে আমার
মন ভুলায় রে।
গ্রামছাড়া ওই রাঙা মাটির পথ
আমার মন ভুলায় রে
গ্রামছাড়া ওই রাঙা মাটির পথ
আমার মন ভুলায় রে ।।
Ore Nil Doriya Lyrics
ওরে নীল দরিয়া
আমায় দে রে দে ছাড়িয়া
বন্দি হইয়া মনোয়া পাখি হায় রে
কান্দে রইয়া রইয়া
ওরে নীল দরিয়া
আমায় দে রে দে ছাড়িয়া
বন্দি হইয়া মনোয়া পাখি হায় রে
কান্দে রইয়া রইয়া
ওরে নীল দরিয়া
কাছের মানুষ দূরে থুইয়া
মরি আমি ধরফড়াইয়া রে
দারুণ জ্বালা দিবানিশি
দারুণ জ্বালা দিবানিশি অন্তরে অন্তরে
আমার এত সাধের মন বঁধুয়া হায় রে
কী জানি কী করে
ওরে সাম্পানের নাইয়া
আমায় দে রে দে ভিড়াইয়া
বন্দি হইয়া মনোয়া পাখি হায় রে
কান্দে রইয়া রইয়া
ওরে সাম্পানের নাইয়া
হইয়া আমি দেশান্তরী
দেশ-বিদেশে ভিড়াই তরী রে
নোঙ্গর ফেলি ঘাটে ঘাটে
নোঙ্গর ফেলি ঘাটে ঘাটে বন্দরে বন্দরে
আমার মনের নোঙ্গর পইড়া আছে হায় রে
সারেং বাড়ির ঘরে
এই না পথ ধইরা
আমি কত যে গেছি চইলা
একলা ঘরে মন বঁধুয়া আমার
রইছে পন্থ চাইয়া
Writer(s): Alom Khan, Mukul Chudhury
Sunday, December 17, 2023
Using JavaScript to Upload Large Files in Chunks as Parts and Avoid Server Limits
The PHP configuration limits the size of files that can be uploaded to the server, so uploading large files can be a real pain. You have to find the loaded php.ini file, edit the upload_max_filesize and post_max_size settings, and hope that you never have to change servers and do all of this over again.
Here we will use a trick to upload very large files to the server without touching any configuration, no matter how small the limit is. We will send the file from the browser chunk by chunk, and a chunk can be as small as we need. For example, if our server accepts requests of at most 1 MB, we simply send 1 MB per request, as many requests as it takes.
This involves two parts: (1) sending the file chunk by chunk from the browser to the server, and (2) appending the received parts into a single file on the server. I will use PHP on the backend to save the file.
Below is the JavaScript part, with comments describing each step:
<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="UTF-8">
    <title>File Upload Chunk</title>
</head>
<body>
<div class="form">
    <input type="file" id="file"/>
    <button type="button" id="upload">Upload</button>
</div>
<div class="toast">
    <div id="toast"></div>
</div>
<script type="text/javascript">
    document.getElementById("upload").addEventListener("click", () => {
        let files = document.getElementById("file").files;
        if (files.length === 0) {
            alert("File required");
        } else {
            showToast("Uploading");
            doUploadFileChunk(files[0], 1);
        }
    });

    function doUploadFileChunk(file, chunk) {
        let chunkLimit = 1024 * 50; // Chunk size sent to the server is 50 KB, for test purposes

        // Blob from position
        let fromSlice = (chunk - 1) * chunkLimit;
        // Blob to position
        let nextSlice = fromSlice + chunkLimit - 1;

        // Build the form data for this chunk
        let formData = new FormData();
        formData.append("file_number", chunk);
        formData.append("file_name", file.name);
        formData.append("file_part", file.slice(fromSlice, nextSlice + 1));

        // Processing small part of file
        console.log(`Sending bytes ${fromSlice} to ${nextSlice}`);

        // Sending to server
        let xhr = new XMLHttpRequest();
        xhr.open("POST", "upload.php", true);
        xhr.setRequestHeader("Accept", "application/json");
        xhr.onreadystatechange = () => {
            if (xhr.readyState === 4) {
                console.log("Status = " + xhr.status);
                try {
                    let response = JSON.parse(xhr.responseText);
                    console.log(response);
                    if (response.uploaded) {
                        let percent = Math.floor((nextSlice / file.size) * 100);
                        if (nextSlice < file.size) {
                            showToast(`Uploading file - ${percent}% completed`);
                            setTimeout(() => {
                                doUploadFileChunk(file, chunk + 1);
                            }, 200);
                        } else {
                            showToast(`Upload completed, processing started`);
                        }
                    } else {
                        alert("Failed to upload file");
                    }
                } catch (e) {
                    console.log("Complete = " + xhr.responseText);
                    console.log(e);
                    alert("Failed to upload file");
                }
            }
        };
        xhr.send(formData);
    }

    function showToast(text) {
        document.getElementById("toast").innerHTML = text;
    }
</script>
<style type="text/css">
    div.form {
        padding: 20px;
    }

    div.toast {
        padding: 20px;
        background-color: green;
        color: white;
        font-size: 31px;
    }
</style>
</body>
</html>
And the corresponding PHP file (upload.php) is as below:
<?php

$output = ["uploaded" => true, "file" => $_FILES["file_part"]];

$uploadFile = sprintf("storage/%s", $_POST['file_name']);

// For the first chunk, remove any previously assembled file with the same name
if ($_POST["file_number"] == 1) {
    if (file_exists($uploadFile)) {
        unlink($uploadFile);
    }
}

$destination = fopen($uploadFile, "a+");
if (FALSE === $destination) die("Failed to open destination");

$handle = fopen($_FILES["file_part"]["tmp_name"], "rb");
if (FALSE === $handle) die("Failed to open blob");

$BUFFER_SIZE = 1 * 1024 * 1024; // 1MB, bigger is faster
while (!feof($handle)) {
    fwrite($destination, fread($handle, $BUFFER_SIZE));
}

fclose($handle);
fclose($destination);

echo json_encode($output);
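For reference, the same append logic can be written more compactly. This is just a sketch under the same assumptions as above (the file_number, file_name, and file_part fields sent by the JavaScript, and a writable storage/ directory):

<?php
// Compact variant of upload.php (sketch): append each received chunk to the target file
$uploadFile = sprintf("storage/%s", $_POST['file_name']);

// First chunk starts a fresh file; later chunks are appended
if ($_POST["file_number"] == 1 && file_exists($uploadFile)) {
    unlink($uploadFile);
}

// Read the uploaded blob and append it in one call
$chunk = file_get_contents($_FILES["file_part"]["tmp_name"]);
file_put_contents($uploadFile, $chunk, FILE_APPEND);

echo json_encode(["uploaded" => true]);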
Wednesday, December 13, 2023
Auto Grow a Textarea with Javascript | How to create auto-resize textarea using JavaScript
The idea is to make a <textarea> behave more like a <div>, so it expands in height as much as it needs to in order to contain the current value. It's almost weird that there isn't a simple native solution for this.
The trick is to exactly replicate the look, content, and position of the <textarea> in another element that can auto-expand in height, and to hide that replica visually (while leaving the technically functional <textarea> visible).
You need to make sure the replicated element is styled exactly the same: same font, same padding, same margin, same border… everything. It's an identical copy, just visually hidden with visibility: hidden;. If it's not exactly the same, the two elements won't grow together exactly right. We also need white-space: pre-wrap; on the replicated text, because that is how textareas treat whitespace.
Example is as below:
HTML Part

<div class="grow-wrap">
    <textarea name="text" id="text"></textarea>
</div>

CSS Part

/* For grow textarea */
.grow-wrap {
    /* easy way to plop the elements on top of each other and have them both sized based on the tallest one's height */
    display: grid;
    width: 100%;
}

.grow-wrap::after {
    /* Note the weird space! Needed to prevent jumpy behavior */
    content: attr(data-replicated-value) " ";
    /* This is how textarea text behaves */
    white-space: pre-wrap;
    /* Hidden from view, clicks, and screen readers */
    visibility: hidden;
}

.grow-wrap > textarea {
    /* You could leave this, but after a user resizes, then it ruins the auto sizing */
    resize: none;
    /* Firefox shows a scrollbar on growth, you can hide it like this. */
    overflow: hidden;
}

.grow-wrap > textarea,
.grow-wrap::after {
    font: inherit;
    grid-area: 1 / 1 / 2 / 2;
    margin: 0 !important;
    padding-top: 10px;
    padding-bottom: 10px;
}

JavaScript Part

const growers = document.querySelectorAll(".grow-wrap");

growers.forEach((grower) => {
    const textarea = grower.querySelector("textarea");
    textarea.addEventListener("input", () => {
        // Mirror the textarea's value into the data attribute read by the ::after pseudo-element
        grower.dataset.replicatedValue = textarea.value;
    });
});
A live example is below:
Tuesday, December 5, 2023
Streaming HTTP response in PHP - turn long-running process into realtime UI
Streaming is not a new concept. It is a data transfer technique that allows a web server to continuously send data to a client over a single HTTP connection that remains open. With streaming, the response arrives in chunks rather than all at once. In the traditional HTTP request/response cycle, the response is not transferred to the browser until it is fully prepared, which makes users wait.
Output buffering lets PHP store output in memory (a buffer) instead of transmitting it immediately: rather than sending the response as it is produced, we hold it in memory and send it all at once when the whole content is ready.
Every echo is basically telling PHP to send output to the browser, but since output buffering is usually enabled, that content gets buffered and is not sent to the client right away. Here we will instead tell PHP to send output immediately as it appears, rather than keeping the user waiting until the script has finished.
The PHP script will look like the one below, which sends content chunk by chunk:
<?php

// Making maximum execution time unlimited
set_time_limit(0);

// Send content immediately to the browser on every statement that produces output
ob_implicit_flush(1);

// Deletes the topmost output buffer and outputs all of its contents
ob_end_flush();

sleep(1);
echo sprintf("data: %s%s", json_encode(["content" => "Stream 1"]), "\n\n");

sleep(2);
echo sprintf("data: %s%s", json_encode(["content" => "Stream 2"]), "\n\n");

sleep(3);
echo sprintf("data: %s%s", json_encode(["content" => "Stream 3"]), "\n\n");

exit;
Output buffers catch output given by the program. Each new output buffer is placed on the top of a stack of output buffers, and any output it provides will be caught by the buffer below it. The output control functions handle only the topmost buffer, so the topmost buffer must be removed in order to control the buffers below it.
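As a small sketch of that idea (assuming an arbitrary number of nested buffers may already be active when the script starts), one can drop every buffer on the stack and then push each chunk to the client explicitly:

<?php
// Sketch: drop every output buffer that may already be on the stack
while (ob_get_level() > 0) {
    ob_end_flush();
}

// From here on, push each chunk to the client explicitly
for ($i = 1; $i <= 3; $i++) {
    echo sprintf("data: %s%s", json_encode(["content" => "Stream $i"]), "\n\n");
    // flush() asks PHP (and the web server) to send whatever is buffered so far
    flush();
    sleep(1);
}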
✔ ob_implicit_flush(1) enables implicit flushing, which sends output towards the browser as soon as it is produced.
✔ If you need more fine-grained control, use the flush() function. To send data even when the buffers are not full and PHP execution is not finished, we can call ob_flush() and flush(). The flush() function requests the server to send its currently buffered output to the browser.

How to get and process the response in JavaScript

Here is a simple example of how we can do it with a traditional XHR (XMLHttpRequest) request:
function doCallXHR() {
    let lastResponseLength = 0;

    let xhr = new XMLHttpRequest();
    xhr.open("POST", "/do", true);
    xhr.setRequestHeader("Content-Type", "application/json");
    xhr.setRequestHeader("Accept", "application/json");

    xhr.onprogress = (e) => {
        let response = e.currentTarget.response;
        // Only the part received since the last progress event
        let progressResponse = lastResponseLength > 0 ? response.substring(lastResponseLength) : response;
        lastResponseLength = response.length;

        console.log(new Date().toUTCString());
        progressResponse
            .split(/\n\n/g)
            .filter((t) => t.trim().length > 0)
            .forEach((line) => {
                // Strip the leading "data: " prefix before parsing the JSON payload
                console.log(JSON.parse(line.substring(6)));
            });
    };

    xhr.onreadystatechange = () => {
        if (xhr.readyState == 4) {
            console.log("Status = " + xhr.status);
            console.log("Complete = " + xhr.responseText);
        }
    };

    xhr.send();
}

doCallXHR();
The output in the browser console is as below:

Wed, 10 Jul 2024 13:33:36 GMT
{content: 'Stream 1'}
Wed, 10 Jul 2024 13:33:38 GMT
{content: 'Stream 2'}
Wed, 10 Jul 2024 13:33:41 GMT
{content: 'Stream 3'}
Status = 200
Complete = data: {"content":"Stream 1"} data: {"content":"Stream 2"} data: {"content":"Stream 3"}
xhr.onprogress is the handler called periodically with the data received so far, until the XMLHttpRequest completely finishes.
A few points to note:
✅ We are sending the response from the server in chunks, and in xhr.onprogress each new part arrives merged with the previously received parts.
✅ It is possible to process the response one chunk at a time, since the server sends multiple parts in a consistent format one after another. We do it by cutting off the previously seen response length and parsing each remaining line with JSON.parse.

What if an error or exception occurs, how do we react to that? That's easy: catch the error and respond with a payload that the front-end JavaScript can react to.
<?php

try {
    $response = $this->expensiveProcessing();
} catch (\Exception $e) {
    // Handle the exception
    echo json_encode([
        'success'  => false,
        'message'  => $e->getCode() . ' - ' . $e->getMessage(),
        'progress' => 100
    ]);
    ob_end_flush();
    exit;
}
Configuration for Nginx

You need to do a few tweaks to the Nginx server before response streaming will work:

fastcgi_buffering off;
proxy_buffering off;
gzip off;

If for whatever reason you don't have access to the Nginx server configuration, you can achieve much the same result from PHP code via an HTTP header:

header('X-Accel-Buffering: no');
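Putting that together, the top of the streaming script might start with something like the following. This is only a sketch, not a required recipe; the exact headers depend on your proxy setup, and the Cache-Control header and zlib.output_compression setting are assumptions added here for completeness:

<?php
// Sketch: settings often applied before streaming output through Nginx

// Ask Nginx not to buffer this response
header('X-Accel-Buffering: no');

// Discourage intermediaries and the browser from caching the partial response (assumption)
header('Cache-Control: no-cache');

// Make sure PHP-level compression does not buffer the output (assumption)
ini_set('zlib.output_compression', '0');

set_time_limit(0);
ob_implicit_flush(1);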