Have You Ever Heard Of The Binning Technique?

Have you ever heard of the binning technique?

My favorite cartographer is John M. Nelson. In fact, he's the one who got me looking into what 'cartography' really is. Fortunately, he's a mix of storyteller, technical support analyst and designer, so his techniques are the ones I have the least trouble understanding. That's not meant as a slight on anyone; I'm a little slow, and John is a very generous teacher when it comes to explaining things, even in replies to posts. You can witness his work first hand on his blog here:

https://adventuresinmapping.com/

The first of his works that captured my attention was the Six Month Drought of the American Southeast map, created using the binning method. I didn't even know what binning was, but the map was so pretty it had me announcing my loyalty to the #cartography hashtag.

So what is binning? According to GIS Lounge, binning is a data modification technique where the original data values are converted into small intervals called bins. Each bin is then represented by a single value for that interval, which reduces the number of data points.
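If it helps to see that outside of GIS, here's a toy sketch in pandas (made-up numbers, nothing to do with John's data):

```python
import pandas as pd

# Made-up severity scores, purely to illustrate the idea
values = pd.Series([2, 5, 7, 11, 14, 18, 21, 23])

# Cut the raw values into 4 equal-width bins, then represent each bin
# by a single value (here, the interval midpoint)
bins = pd.cut(values, bins=4)
representative = bins.apply(lambda interval: interval.mid)

print(pd.DataFrame({"value": values, "bin": bins, "representative": representative}))
```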

Okay. It should be a no-brainer. But the data he used was polygon shapefiles of drought extents and their severity. I still don't know how USGS actually collects this data, but his map sang a deserving anthem to their hard work. Alas, I never had the chance to reproduce it. I don't have the knack for identifying interesting data of any sort, so I'm either stuck reproducing redundant work or wasting my time on a wild goose chase for data; I'm a noob with tunnel-vision focus. I wouldn't even vote for myself if we had a jungle excursion that required mapping, because we'd be stuck longer than necessary.

Even so, one year later, at precisely this moment...I found a valid reason to attempt it. And it's all because I needed to validate a satellite imagery classification some colleagues made to show hot spots of global deforestation. I am not a remote sensing wizard, but vector data...now that I can work with.

Using the same binning technique, I can summarize the steps as follows:

1. Merge all the data of the deforestation variables
2. Generate a hexagonal tessellation
3. Create the hexagon centroids
4. Use 'Spatial Join' to sum up the weights of the overlapping polygon features of the merged data and join them to the hexagon centroids
5. Configure the symbology
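For anyone who'd rather read those steps as code than as ArcGIS Pro dialogs, here's a rough geopandas sketch of the same workflow. It's only a sketch under assumptions: the file names and the 'weight' column are placeholders, the 10 km hexagon size assumes a metric CRS, and I'm building the hexagons by hand since geopandas has no tessellation tool of its own.

```python
import numpy as np
import pandas as pd
import geopandas as gpd
from shapely.geometry import Polygon

# 1. Merge all the deforestation variable layers (paths and the 'weight'
#    column are placeholders; layers are assumed to share a projected CRS)
layers = [gpd.read_file(p) for p in ["variable_a.shp", "variable_b.shp"]]
merged = pd.concat(layers, ignore_index=True)

# 2. Generate a flat-topped hexagonal tessellation over the data extent
def hex_grid(bounds, size, crs):
    minx, miny, maxx, maxy = bounds
    dx, dy = 1.5 * size, np.sqrt(3) * size
    angles = np.radians(np.arange(0, 360, 60))
    hexes = []
    for i, x in enumerate(np.arange(minx - size, maxx + size, dx)):
        offset = dy / 2 if i % 2 else 0
        for y in np.arange(miny - size, maxy + size, dy):
            hexes.append(Polygon(zip(x + size * np.cos(angles),
                                     y + offset + size * np.sin(angles))))
    return gpd.GeoDataFrame({"hex_id": range(len(hexes))}, geometry=hexes, crs=crs)

hexagons = hex_grid(merged.total_bounds, size=10_000, crs=merged.crs)  # ~10 km

# 3. Create the hexagon centroids
centroids = hexagons.copy()
centroids["geometry"] = hexagons.centroid

# 4. 'Spatial Join': sum the weights of every merged polygon each centroid falls on
joined = gpd.sjoin(centroids, merged[["weight", "geometry"]], predicate="intersects")
weight_sum = joined.groupby("hex_id")["weight"].sum().rename("weight_sum")
centroids = centroids.merge(weight_sum, on="hex_id", how="left")

# 5. Symbology (e.g. graduated point sizes on weight_sum) is then set in the GIS
```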

Visualizing was a herculean effort for my brain. The map John made is a bivariate map. Compared to his data, which has two numerical variables to enable that, mine only had one: the summation of the ranking weights I assigned to the deforestation variables. He merged shapefiles of week after week of drought severity readings. Me...I just managed this >>>

[map: the merged deforestation data]

My first attempt was to just visualize the probability of deforestation using the centroid point sizes.

[map: first attempt, scaling the centroid point sizes]

That wasn't much of a success because, visually, it doesn't really aid comprehension. It looks good when you zoom in closer, because it gives off that newspaper-print feel with that basemap. At this full extent, it's not helpful.

So, after trying to no avail to make it work by toggling the sizes and the colors, I decided that instead of trying to make it look nice, I'd better focus on answering the question posed by my colleague: could you identify the areas with a high likelihood of prolonged deforestation? For that purpose, only a hexagonal mesh would do the trick. So, based on the 10 sq km hexagons they used to depict areas of deforestation from the image classification, I used 'Spatial Join' again and joined the centroids back to their parent hexagons to carry over the binned values (a rough sketch of that step follows below).
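Continuing the sketch from earlier, that join back to the parent hexagons could look something like this (still with the placeholder column names):

```python
# Transfer the binned values from the centroids back onto their parent hexagons.
# 'hexagons' and 'centroids' are the GeoDataFrames from the earlier sketch.
hex_binned = gpd.sjoin(
    hexagons,
    centroids[["weight_sum", "geometry"]],
    how="left",
    predicate="contains",  # each hexagon contains exactly one centroid: its own
).drop(columns="index_right")

# hex_binned now carries weight_sum per hexagon, ready for a graduated color fill
```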

Et voila!

[map: the binned hexagons]

The weight summation represents the degree of likelihood of prolonged deforestation, and the values range all the way up to 24. I made 4 intervals, which gave a practical visualization. Eight intervals were pushing it and 6 were not pleasant. It could be my color palette choice that made them unappealing, but too many intervals would defeat my purpose.

Yay or nay...I'm not too sure. But I do believe this summarizes the areas that conservationists should be on alert for.

After having a discussion with a colleague, yeah...this technique has a lot of gaps. 

ONE: the phenomenon is not a point feature. Using only the values where the centroid touches/overlays the polygons is not exactly a precise method, although it is not wrong either.

TWO: the merged polygonal data came out as OVERLAPPING polygon features.

Overlooking the shortcomings and just using it as a visual aid for cross-checking...yeah, maybe. Even then, it's not as laser-point precise as one would aspire to. I stand humbled.

More Posts from Azaleakamellia and Others

3 years ago

📑 International Climate Initiative (IKI) Land Use Plan: Green Initiative in the Heart of Borneo (HoB) Report


Tool: ArcGIS Pro 2.9.3 Technique: Overlay analysis, visualization via remote sensing technique

These maps were developed to aid or supplement the Natural Capital Valuation (NatCap) initiative. As cited by WWF:

An essential element of the Natural Capital Project is developing tools that help decision makers protect biodiversity and ecosystem services.

One of the sites included in this initiative by WWF-Malaysia is the Heart of Borneo (HoB). Specifically for this exercise, the visualization of policy and land use eventually became the data input for the InVEST tool, which generates the models and maps of the economic values of ecosystem services within the landscape of interest.

Generating the data mainly involved light-touch remote sensing to assess the status of land use in the respective concessions, using Sentinel-2 imagery with specific band combinations to identify tree cover, particularly mangrove forest.
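The post doesn't spell out how the band combination was applied in ArcGIS Pro, but for a rough idea, here's a hedged rasterio sketch of stacking a shortwave-infrared composite (B12/B8A/B4 is one combination often used to make mangrove stand out); the band file paths are made up.

```python
import numpy as np
import rasterio

def stretch(band, pmin=2, pmax=98):
    """Percentile contrast stretch to 0-1 for display."""
    lo, hi = np.percentile(band, (pmin, pmax))
    return np.clip((band - lo) / (hi - lo), 0, 1)

# Hypothetical Sentinel-2 band files, already resampled to a common resolution
paths = ["S2_B12.tif", "S2_B8A.tif", "S2_B04.tif"]  # SWIR, narrow NIR, red
bands = [rasterio.open(p).read(1).astype("float32") for p in paths]

# Stack SWIR/NIR/Red into an RGB composite in which dense, wet vegetation
# such as mangrove tends to stand out from other tree cover
composite = np.dstack([stretch(b) for b in bands])
```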


2 years ago

Uninspired

Kuching City Road Network (Saturday, 10/02/2023)

I am a recklessly uninspired person. I call myself a map-maker, but I don't really get to make maps, partly because I don't think I should venture outside my requesters' requests. Mostly, I am compelled to get it right, and I feel good if I can deliver what they need. The thing is, I no longer get spontaneously inspired to make maps. Just as the rules become clearer the more you read books on cartography, fears crop up like the plants in 'Plants vs. Zombies' 🌱 on PlayStation.

So, I am scared that my excitement about making maps is beginning to wear off; really making them, not just knowing how to make them.

What sort of idea is great? I mean, what should I focus on trying to make? There is so much data out there that whatever I attempt may miss the train or just pale in comparison to other incredible work. I don't really mind that, but I'm not so young that I don't understand that self-esteem does ease the thinking process.

Can't say much, I mean...the 30 Day Map Challenge hasn't gone all that well for me. I should've prepared something before the event started. I quit after the 3rd challenge because I overthink and get panic attacks every time I feel I'm doing things half-assed.

Despite all that, I am lucky to have aggressively supportive siblings. They just can't seem to stop the tough love, always kicking me to just barf something out.

'It's the process that matters!'

When did I start forgetting how wonderful the process is, huh?


2 years ago

The devil in the details


I have started to post some videos demonstrating some tools in ArcGIS Pro. Short and pretty quick ones, which is what I strived for, since I am absolutely frightened by the idea of irritating people with unnecessary voice-over. They haven't garnered much response, and that's cool with me. Although the lack of traction does things to my insides, I go back to the real reason I'm doing this, which is to stash the tools I've managed to learn on my own through trial and error and keep them somewhere I can refer back to when I need to remember how they work.

Creating maps involves a number of iterative processes tailored to the intended output. Although map-making itself is a form of art, heavily reliant on the target audience's knowledge and aesthetic preferences, it is still an inherently democratic science. Thus, knowing the mainstream technology and tools the industry uses to express your vision or message is a given. So, for those just starting out with geographic information system (GIS) software for your final year project or research, these videos are meant for you. The purpose is not to overwhelm you with too much information, or distract you with my narration, but to follow in real time the process from starting up the software to running the tools that generate the information needed.

Knowing full well that there is an endless variety of GIS software and tools out there, the processes you need to execute to make things happen may vary in name and functionality. Forget the beef between ArcGIS and QGIS over which is the better tool; if it serves your needs, then use it. You're not obliged to pledge loyalty to software or brands, although you are encouraged to maintain integrity in your beliefs when it comes to corporate versus open-source tools in the industry. Both choices come with advantages and disadvantages. Yours truly uses QGIS and ArcGIS Pro interchangeably. If something doesn't work in ArcGIS Pro, which I use primarily, I'll jump to QGIS. It's not a big deal. If it works painlessly, there is no reason to feel bad about using it.

So far, the content I have made emphasizes mostly ArcGIS Pro and Esri products, since using them is how I came to learn more about geology and geography. QGIS was a name I never heard of during my university years, back when ArcGIS versions started with the digit 9️⃣, so you can catch my drift.

We can go on and on about theoretical stuff, and our smarter pals usually know what to do when faced with the tools. Unfortunately, I fall into the percentile that needed to land on the job to understand what on earth I was supposed to do. This series of videos is for those who have the same problem I do and need to see the magic actually happening before knowing what to do. For the most part, there are so many things to read and try out before you get it right. So hopefully the demos can kickstart some thoughts or observations about the logic within the software's ecosystem, and help you become more than just a technical power-user.

This week, I touched on some tools that I found helpful when dealing with point vector data, so feel free to check it out 👇🏻

Next week, I'm thinking of exploring a series of point analyses, and the space time cube is beckoning me to test it out. Until then, stay cool and drop a word if you need any clarification on the demos!


4 years ago
GitMind - Free online mind map & flowchart tool. 100+templates. Create, share and collaborate online.

Yes peeps. I've been studying, and contrary to all my previous attempts to make beautiful notes, I said f it and just worked with what helps me clear my head the fastest 🏃🏻‍♀️. I love writing notes, but I realized that to gather my thoughts properly, I need some way of not wasting paper just to arrange and rearrange my ideas and comprehension of things.

What better way of doing that than using a mind map!

So, you kiddos out there who are starting out with Python and just can't wait to get into deep learning or machine learning: I'd say hold your horses for a minute and get a preview of the pond you're trying to jump into. And don't be scared, because we're all friends here in the hell-hole of the learning plateau. Will it get better? I believe so. I'm positive I now understand more of the principles of deep learning and the relevance of the Python libraries associated with it. Yes...this is a Python bar, darling. 👩🏻‍💻

There's no real shortcut if you ask me, since we all have different ways of comprehending things; my pre-existing mold may have a harder time grasping the things I'm learning right now than yours would. So don't be afraid to doodle while you think. No amount of paper will ever be enough to help you understand things, so better start being sustainable by using digital platforms and saving that paper for when you're truly ready to pen out your understanding of things; not what you read. There's a difference!

Check out the mind map of some essential Python libraries you can get started with before you do any deep learning. It's worth reviewing it all beforehand, I promise.

Have fun! 🙆🏻‍♀️


1 year ago

Peta Gunatanah Malaysia 2014 - 2018


The Peta Gunatanah Malaysia 2014 - 2018 ("Malaysia's Land Cover 2014 - 2018") web application is a platform built for the Quality Assessment activity organized by the Forest Research Institute Malaysia (FRIM) on 23rd June 2024.

The workshop aims to collect field/reference data from Malaysian state agencies in an effort to verify the quality of the land cover classification output, generated in support of measuring CO2 release from converted agricultural lands.

Participants are able to access the app via conventional browsers on their mobile devices and submit drawings/sketches that they have captured within the interactive data layers.

This web app aims to support direct input from the source for the task of improving the accuracy of the generated land cover maps. Vectors generated from this exercise are readily standardized to the data schema required for quality assessment, making full use of the ArcGIS Online ecosystem to produce concrete output and actionable information.


1 year ago
🌱 Google Earth Engine 101


Uploading a shapefile as an asset in GEE and making use of it

🟢 Beginner-friendly.

🆓 Free with no hidden monetary cost.

🤚🏻 Requires registration, so sign up 👉🏻 https://signup.earthengine.google.com/; accessed via a browser with an Internet connection.

🖥️ Available for Windows, Mac and Linux.

Google Earth Engine, or GEE as it is lovingly called, is another free platform from Google that provides a vast and comprehensive collection of earth observation data. Since Sentinel-2 is no longer available for download from USGS Earth Explorer and I find the alternative too challenging, GEE seems like the easiest way to go. If you're looking for a one-stop platform to access satellite imagery for free, GEE is a great place to start. You don't have to learn JavaScript explicitly to start using this tool.
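To give a feel for what "making use of it" looks like once the shapefile is sitting in your Assets tab: the demo itself is in the Code Editor, but here's the same idea sketched with the Earth Engine Python API. The asset ID, dates and cloud threshold are placeholders, and the upload itself is done through the Code Editor's Assets tab.

```python
import ee

ee.Initialize()

# The uploaded shapefile shows up as a table asset; the ID below is a placeholder
aoi = ee.FeatureCollection("users/your_username/my_study_area")

# Use the asset to filter and clip a Sentinel-2 surface reflectance collection
s2 = (
    ee.ImageCollection("COPERNICUS/S2_SR_HARMONIZED")
    .filterBounds(aoi)
    .filterDate("2023-01-01", "2023-12-31")
    .filter(ee.Filter.lt("CLOUDY_PIXEL_PERCENTAGE", 20))
)

composite = s2.median().clip(aoi)
print("Images found:", s2.size().getInfo())
```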


3 years ago

Community Empowerment Strategy Dashboard 2021 | WWF-Malaysia


Tool: Operations Dashboard for ArcGIS, Survey123 for ArcGIS, ArcGIS Online Technique: XLSForm programming, web application development

The northern highland communities of the Lun Bawang have been collaborating with WWF-Malaysia under the Sarawak Conservation Programme (SCP) to empower sustainable economies and manage their natural biodiversity through the Community Empowerment Strategy (formerly known as the Community Engagement and Education Strategy).

Since 2016, the communities have been actively mapping out their land uses and culturally important locations to delineate their areas of settlement and sources of livelihood. Given the close vicinity of their communities to licensed timber concessions, producing a definitive map is important to preserve and conserve the surrounding natural capital.

Several outreach sessions have been held, and the community mapping effort has shifted to implementing citizen science via the Survey123 for ArcGIS mobile application, which is part of the ArcGIS ecosystem. This enables the local community to collect information despite the lack of network reception; the data can still be synchronized automatically when a connection becomes available, or shared manually with the field officers.

📌 Availability: Retracted in 2021


1 year ago

Malaysia Bid Round 2023 (MBR 2023)


Tool: ArcGIS Pro 2.6.3 Technique: Symbolization, labeling and SQL expression

MBR 2023 is the peak event that culminates all the effort of data collection and stocktaking of hydrocarbon resources in Malaysia. It is an annual event that puts together all the exploration blocks, discoverable hydrocarbon fields and late-life assets for the upstream sector to evaluate and invest in.


Leading up to the event, Malaysia Petroleum Management (MPM) updates, re-evaluates and produces maps, both static and digital, to cater to the need for the most up-to-date stocktake of information gained from various sources: exploration output (seismic, full tensor gradiometry), assets (cables, pipelines, platforms), as well as discoverable resources. This year's theme aims to include various prospects and initiatives to align the industry with lower carbon emissions and to explore options for carbon capture and storage (CCS) in popular basins such as the Malay and Penyu Basins. This is a big follow-up to the closing of MBR 2022, with the PSC signing for 9 blocks a few days earlier.


Credit: Sh Shahira Wafa Syed Khairulmunir Wafa

Over ~70 maps for unique blocks were produced during the finalization stage, ~210 maps during data evaluation and an additional 20 for the event. And this excludes the standardized maps that formalize information requested by prospective bidders as well as clients facing the prospect of extending their contracts.

Standardizing the maps required optimizing the workflow and building standard templates to cater for rapid changes and rapid export of output.
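The post doesn't say how the exports were automated, but this is the kind of step that's often scripted; here's a hedged arcpy sketch (project path, layout names and output folder are all made up) of pushing every layout in a standard template out to PDF.

```python
import os
import arcpy

# Hypothetical template project and output folder
aprx = arcpy.mp.ArcGISProject(r"C:\MBR2023\templates\block_maps.aprx")
out_dir = r"C:\MBR2023\exports"
os.makedirs(out_dir, exist_ok=True)

# Export every layout in the template to PDF with a consistent naming scheme
for layout in aprx.listLayouts():
    layout.exportToPDF(os.path.join(out_dir, f"{layout.name}.pdf"), resolution=300)

del aprx  # release the lock on the project file
```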

For more information on the event, please access the following resources:

PETRONAS: Malaysia Bid Round

PETRONAS myPROdata

The Malaysian Reserve: Petronas offers 10 exploration blocks in MBR 2023


2 years ago
Azalea Kamellia Abdullah on LinkedIn: #sustainability #development #greeneconomy
linkedin.com
I rarely keep record of the maps I make and my portfolio is as thick as an amoeba. But when I find them, I'm extra extra happy. There are

3 years ago

Python: Geospatial Environment Setup (Part 1)


Here's a quick rundown of what you're supposed to do to prepare yourself to use Python for data analysis.

Install Python ☑

Install Miniconda ☑

Install the basic Python libraries ☑

Create new environment for your workspace

Install geospatial Python libraries

🐍 Installing Python

Let's cut to the chase. It's December 14th, 2021. Python 3 is currently at version 3.10.1. It's a great milestone for Python 3, but there is hearsay about issues with 3.10 when using it with conda. Since we're using conda for our Python library and environment management, we'll stay safe by installing Python 3.9.5.

Download 👉🏻 Python 3.10.1 if you want to give a hand at some adventurous troubleshooting

Or download 👉🏻 Python 3.9.5 for something quite fuss-free

📌 During installation, don’t forget to ✔ the option Add Python 3.x to PATH. This enables you to access your Python from the command prompt.

Installing Miniconda

As a beginner, you'll be told that Anaconda is the easiest Python library manager GUI for working with conda, and that it contains all the core and scientific libraries you'll ever need for data analysis right after installation. So far, I find it unnecessarily heavy, the GUI isn't too friendly and I don't use most of the pre-installed libraries. So, after a few years in the dark about it, I decided to jump ship and use the skimmed-down version of conda: Miniconda.

Yes, it does come with the warning that you should have some sort of experience with Python to know what core libraries you need. And that’s the beauty of it. We’ll get to installing those libraries in the next section.

◾ If you're skeptical about installing libraries from scratch, you can download 👉🏻 Anaconda Individual Edition directly and install it without issues; it takes some time to download due to the big file size and a tad bit longer to install.

◾ Download 👉🏻 Miniconda if you’re up to the challenge.

📌 After you've installed Miniconda, you will find it under the Anaconda folder in your Windows Start menu. By this time, you will already have Python 3 and conda ready on your computer. Next, we'll jump into installing the basic Python libraries necessary for core data analysis and create an environment to house the geospatial libraries.

📚 Installing core Python libraries

The core libraries for data analysis in Python are the following:

🔺 numpy: a Python library that enables scientific computing by handling multidimensional array objects, or masked objects including matrices and all the mathematical processes involved.

🔺 pandas: enables the handling of ‘relational’ or 'labeled’ data structure in a flexible and intuitive manner. Basically enables the handling of data in a tabular structure similar to what we see in Excel.

🔺matplotlib: a robust library that helps with the visualization of data; static, animated or interactive. It’s a fun library to explore.

🔺 seaborn: another visualization library that is built based on matplotlib which is more high-level and produces more crowd-appealing visualization. Subject to preference though.

🔺 jupyter lab: a web-based user interface for Project Jupyter where you can work with documents, text editors, terminals and/or Jupyter Notebooks. We are installing this library to tap into the notebook package that comes with it.

To start installing:

1️⃣ At Start, access the Anaconda folder > Select Anaconda Prompt (miniconda3)

2️⃣ An Anaconda Prompt window similar to the Windows command prompt will open > Navigate to the folder where you would like to keep your analytics workspace using the following common command prompt commands:

◽ To backtrack a folder location (go up one level), the usual command is `cd ..`

◽ To change the current drive to the X drive, type the drive letter followed by a colon, e.g. `X:`

◽ To navigate to a folder of interest, e.g. deeper into the Lea folder, use `cd` with the path, i.e. `cd Lea\folder_x\folder_y`

3️⃣ Once you've navigated to the folder of your choice, you can install all of the libraries in a single command, typically something along the lines of `conda install numpy pandas matplotlib seaborn jupyterlab`

The command above installs all of the essential Python libraries needed by any data scientist in one go.

💀 Should there be any issues during the installation, such as an uncharacteristically long installation time (1 hour is stretching it), press Ctrl + C to cancel any pending processes and retry by installing the libraries one by one, i.e. `conda install numpy`, then `conda install pandas`, and so on.

Once you manage to get through the installation of the basic Python libraries above, you are halfway there! With these packages, you are already set to do some pretty serious data analysis. The numpy, pandas and matplotlib libraries are the triple threat for exploratory data analysis (EDA), and jupyter lab provides the notebook that combines documentation and code, shareable and editable among teammates or colleagues.
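As a quick sanity check that the triple threat actually works on your machine, a toy run (made-up numbers, best tried inside a Jupyter notebook) might look like this:

```python
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

# A tiny made-up dataset just to confirm the libraries import and play together
rng = np.random.default_rng(42)
df = pd.DataFrame({
    "rainfall_mm": rng.gamma(shape=2.0, scale=50.0, size=100),
    "temperature_c": rng.normal(loc=27, scale=2, size=100),
})

print(df.describe())                                   # pandas summary stats
df.plot.scatter(x="temperature_c", y="rainfall_mm")    # matplotlib under the hood
plt.show()
```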

Since we're the folks who like to make ourselves miserable with the spatial details of our data, we will climb another 2 hurdles: creating a geospatial workspace using conda and installing the libraries needed for geospatial EDA.

If you're having issues following the steps here, check out the real-time demonstration of the installations at this link 👇🏻

See you guys in part 2 soon!

