Yes, but for a data set that large I would rethink the process.
Are the client's computer and the server on the same (fast) local area network? If so, uploading the file to the server through the web interface might be fine, but loading 10GB of data from that file into the database is going to be slow unless you do a bulk insert at the server level.
If the server and client are not local to each other, then I would lean toward doing an FTP or SFTP transfer of the file from the client computer to the server, then having the server do a bulk insert.
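Once the file is on the server, the load itself can be a single BULK INSERT statement. A minimal sketch follows; the table name, file path, and format options are assumptions about your setup, so adjust them to match your actual file layout:

```sql
BULK INSERT dbo.TargetTable          -- hypothetical destination table
FROM 'C:\uploads\bigfile.csv'        -- hypothetical path on the server
WITH (
    FIELDTERMINATOR = ',',           -- column delimiter in the file
    ROWTERMINATOR   = '\n',          -- row delimiter
    FIRSTROW        = 2,             -- skip a header row, if present
    TABLOCK,                         -- take a table lock; enables minimal logging
    BATCHSIZE       = 100000         -- commit in chunks rather than one giant transaction
);
```

TABLOCK and a reasonable BATCHSIZE make a big difference at this scale; without them the load runs fully logged, row by row.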
https://docs.microsoft.com/en-us/sql/t-sql/statements/bulk-insert-transact-sql?view=sql-server-ver15

Is all of this data new every time, or is it really just some new records at the end and/or a few changed/deleted records? If the majority is static, then look at doing a diff of the original load file against the new one, producing three update files--one for inserts, one for deletions, and one for updates to existing records. Obviously this is a lot more complicated coding-wise. It's probably not worth it if the data load happens only once a month, but it would make sense to pursue if it runs every day.
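One way to realize that diff on the server side, assuming the data has a stable key column, is to bulk-insert the new file into a staging table and compare it against the live table with set queries. A sketch, where all table and column names are placeholders:

```sql
-- Assumes dbo.Staging holds the new file (same layout as dbo.Live)
-- and both tables share a key column named Id.

-- 1. New records: in the new file but not in the live table
SELECT s.*
FROM dbo.Staging AS s
WHERE NOT EXISTS (SELECT 1 FROM dbo.Live AS l WHERE l.Id = s.Id);

-- 2. Deleted records: in the live table but missing from the new file
SELECT l.*
FROM dbo.Live AS l
WHERE NOT EXISTS (SELECT 1 FROM dbo.Staging AS s WHERE s.Id = l.Id);

-- 3. Changed records: same key, but any column differs
--    (the EXCEPT trick compares whole rows and handles NULLs correctly)
SELECT s.*
FROM dbo.Staging AS s
JOIN dbo.Live AS l ON l.Id = s.Id
WHERE EXISTS (SELECT s.* EXCEPT SELECT l.*);
```

Each result set can drive an INSERT, DELETE, or UPDATE against the live table, which keeps the daily load proportional to what actually changed.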
Another thought, if the file is fairly static, is to use rsync or an equivalent (DeltaCopy is a free Windows wrapper around rsync: http://www.aboutmyip.com/AboutMyXApp/DeltaCopy.jsp) to upload only the file's changes, then drop the data and do another bulk insert. That would be the fastest option with the least coding. I've used DeltaCopy for years--you can use the Windows Task Scheduler to send the file daily, hourly, or whenever, from one computer to another.
...jack