Bitcoin Forum
Author Topic: Economic Devastation  (Read 504742 times)
CoinCube (OP)
Legendary
*
Offline Offline

Activity: 1946
Merit: 1055



View Profile
June 21, 2017, 07:34:14 PM
Last edit: June 21, 2017, 07:53:13 PM by CoinCube
 #2941

Earlier in this thread there was a discussion on global warming. I have found this topic to be a particularly difficult one to follow due to its political nature and the large amount of disinformation surrounding it.

Here is a nice little 12 minute video outlining the skeptics case.

https://m.youtube.com/watch?feature=youtu.be&v=0gDErDwXqhc

By Dr. David M.W. Evans

"We check the main predictions of the climate models against the best and latest data. Fortunately the climate models got all their major predictions wrong. Why? Every serious skeptical scientist has been consistently saying essentially the same thing for over 20 years, yet most people have never heard the message. Here it is, put simply enough for any lay reader willing to pay attention..."

Dr. David M.W. Evans consulted full time for the Australian Greenhouse Office (now the Department of Climate Change) from 1999 to 2005, and part time 2008 to 2010, modeling Australia's carbon in plants, debris, mulch, soils, and forestry and agricultural products. Evans is a mathematician and engineer, with six university degrees including a PhD from Stanford University in electrical engineering. The area of human endeavor with the most experience and sophistication in dealing with feedbacks and analyzing complex systems is electrical engineering, and the most crucial and disputed aspects of understanding the climate system are the feedbacks. The evidence supporting the idea that CO2 emissions were the main cause of global warming reversed itself from 1998 to 2006, causing Evans to move from being a warmist to a skeptic.

IadixDev
Full Member
***
Offline Offline

Activity: 322
Merit: 151


They're tactical


View Profile WWW
June 22, 2017, 09:57:20 AM
 #2942


I will take the example of Turing machines and OO programming; maybe it will be clearer what I'm talking about =) Since the concept of entropy is quasi-inexistent with a Turing machine, we know we are not talking about something mystical Cheesy

And I think it can also interest shelby, because he is into this sort of problematic with language design lol

The problem is the conception from metaphysics of organizing the world based on fundamental 'objects' with properties and 'entelechy', which is abstracted in the OO semantics of having classes of objects with properties and 'entelechy' through the alteration of their state by their methods.

So far so good, but the problem comes when you want to program interactions between all the different types of object that can be present in the world; with OO programming it generally becomes a design problem quickly.

...

Either you write a visitor class for each pair of objects, and then each time you add a new type of object you need to add visitor classes for all the combinations the new object can interact with. But it's still bogus from a metaphysical point of view, because it means the interactions between the objects are not contained in the objects themselves, but applied from the exterior through a visitor class that visits the two objects in question.

...

This whole design of hard-typed objects makes emergent properties very hard to program and conceptualize.
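The pairwise-interaction explosion described above can be sketched as follows. This is a minimal illustration, not anyone's actual design: the class names (`Rock`, `Water`, `Fire`) and the `interact` dispatcher are hypothetical, chosen only to show how each new type forces one new handler per existing type, and how the interaction logic ends up living outside the objects themselves.

```python
# Sketch of the pairwise-visitor problem: every interaction between two
# concrete types needs its own handler, so adding an Nth type forces N
# new handlers (quadratic growth overall).

class Rock: pass
class Water: pass
class Fire: pass  # adding this type forces the new entries below

def interact(a, b):
    """External dispatch table: the interaction logic lives outside the
    objects themselves, which is the metaphysical objection raised above."""
    handlers = {
        (Rock, Water): lambda: "erosion",
        (Water, Fire): lambda: "steam",
        (Rock, Fire): lambda: "heated rock",  # one new entry per existing type
    }
    key = (type(a), type(b))
    if key in handlers:
        return handlers[key]()
    if (key[1], key[0]) in handlers:          # interactions are symmetric here
        return handlers[(key[1], key[0])]()
    raise NotImplementedError(f"no handler for {key}")

print(interact(Rock(), Water()))  # erosion
print(interact(Fire(), Water()))  # steam
```

Languages with multiple dispatch (or Python's `functools.singledispatch` for the single-argument case) soften this, but the table of pairwise cases still has to live somewhere outside the objects.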

...

I would agree that in Turing machines the concept of entropy is quasi-inexistent. Most of the time it is entirely absent.

Turing machines:
https://en.wikipedia.org/wiki/Turing_machine
Quote
In his 1948 essay, "Intelligent Machinery", Turing wrote that his machine consisted of:

...an unlimited memory capacity obtained in the form of an infinite tape marked out into squares, on each of which a symbol could be printed. At any moment there is one symbol in the machine; it is called the scanned symbol. The machine can alter the scanned symbol, and its behavior is in part determined by that symbol, but the symbols on the tape elsewhere do not affect the behavior of the machine. However, the tape can be moved back and forth through the machine, this being one of the elementary operations of the machine. Any symbol on the tape may therefore eventually have an innings. (Turing 1948, p. 3)[18]

The underlined portion is the key reason for both a lack of emergence and subsequently the lack of conceptual entropy in Turing machines.

In a standard Turing machine the symbols on the tape do not ultimately change the nature of the machine (even if those symbols have been previously read). This is because the typical Turing machine draws from a finite table of instructions which is ultimately fixed and invariant.

Thus a Turing machine with a fixed and finite table is a simple system, regardless of how complex and long that table may be, unless you allow the table of instructions to be dynamically and permanently altered based on the tape readings.

As programming languages have a fixed set of basic instructions, they are simple Turing machines. However, computer programming language in general is something more and represents a complex system: the programmers using the languages are the equivalent of a tape that applies dynamic updates to the instruction table. Thus over time we have seen the progression from assembly language to C++ as discussed in your links above.
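A minimal simulator makes the point concrete: the transition table below is fixed before the run, and nothing the machine reads can ever alter it. The machine shown (a unary incrementer) and all its names are hypothetical, chosen only for illustration:

```python
# Minimal Turing machine with a FIXED transition table. Whatever appears
# on the tape, the table itself never changes -- which is why, as argued
# above, the machine stays "simple" no matter how long the table is.

def run(tape, table, state="start", pos=0, blank="_", halt="halt"):
    cells = dict(enumerate(tape))              # sparse tape representation
    while state != halt:
        sym = cells.get(pos, blank)
        write, move, state = table[(state, sym)]   # read-only table lookup
        cells[pos] = write
        pos += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells))

# Hypothetical example: append a '1' to a unary string (increment).
table = {
    ("start", "1"): ("1", "R", "start"),   # skip over existing 1s
    ("start", "_"): ("1", "R", "halt"),    # write one more 1, then halt
}
print(run("111", table))  # 1111
```

Allowing the machine to write into `table` itself would cross the line drawn above, from a fixed simple system to one whose instruction set evolves with its history.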

I am not going to be helpful in a technical discussion of how to add emergence to a programmed system as I am not a programmer but I will address one of your points.

You appear to be arguing (in the bolded section above) that if the interactions between objects are not contained in the objects themselves but require an external observer/visitor, then the system is not valid from a metaphysical point of view. If I understand you correctly, you are arguing that a programmed system must be complete to be metaphysically valid.

Completeness is never possible. For a discussion on this point I would refer you to an excellent write up by Perry Marshall: The Limits of Science and Programming

“Without mathematics we cannot penetrate deeply into philosophy.
Without philosophy we cannot penetrate deeply into mathematics.
Without both we cannot penetrate deeply into anything.”

-Leibniz

The example with the Turing machine is to show you can have non-determinism without the concept of entropy Smiley Emergent properties are an example of non-deterministic algorithms that can run on a Turing machine.

It's more that if you want to take a physics model and apply it in a coding algorithm, and there is no algebraic solution but only algorithmic ones, then it's hard to find the 'correct' model to represent the interactions while keeping a minimum of consistency in the high-level definition of the thing.
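One classic illustration of emergent, hard-to-predict behavior arising from a fixed rule running on an ordinary deterministic machine is an elementary cellular automaton. The sketch below uses Rule 110 (which is known to be Turing-complete); the code and its names are my own illustrative choice, not anything from the discussion above:

```python
# Emergent structure from a fixed deterministic rule: elementary
# cellular automaton Rule 110. The 8-entry rule table never changes,
# yet complex, hard-to-predict patterns emerge from it.

RULE = 110  # rule number encodes the 8 outputs for neighborhoods 000..111

def step(cells):
    n = len(cells)
    return [
        (RULE >> ((cells[(i - 1) % n] << 2)    # left neighbor
                  | (cells[i] << 1)            # current cell
                  | cells[(i + 1) % n])) & 1   # right neighbor
        for i in range(n)
    ]

cells = [0] * 31 + [1] + [0] * 31   # start from a single live cell
for _ in range(16):
    print("".join(".#"[c] for c in cells))
    cells = step(cells)
```

Each step is fully determined by the previous one, so this is emergence without any added randomness, which seems to be the distinction being drawn here between emergent behavior and entropy.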

CoinCube (OP)
Legendary
*
Offline Offline

Activity: 1946
Merit: 1055



View Profile
June 23, 2017, 06:34:26 PM
 #2943


600watt shared this thought provoking article on the topic of the evolution of civilization and accounting and how it relates to bitcoin.

The author Daniel Jeffries appears to be a science fiction author, engineer, serial entrepreneur, and now bitcoin commentator which makes for an interesting combination.

IadixDev
Full Member
***
Offline Offline

Activity: 322
Merit: 151


They're tactical


View Profile WWW
June 23, 2017, 07:31:41 PM
 #2944


600watt shared this thought provoking article on the topic of the evolution of civilization and accounting and how it relates to bitcoin.

The author Daniel Jeffries appears to be a science fiction author, engineer, serial entrepreneur, and now bitcoin commentator which makes for an interesting combination.

I'm looking into this

http://iamcicada.com/cicada-deep-dive/

it looks super cool Smiley

the_end_is_near
Member
**
Offline Offline

Activity: 65
Merit: 10


View Profile
June 29, 2017, 09:16:03 AM
Last edit: June 29, 2017, 08:47:40 PM by the_end_is_near
 #2945

This relates to the OP of this thread.

Is the future of mobile computing small screens or docking on large screens?

The original comment was made here, but as usual I seem to get banned wherever I go.

P.S. James A. Donald who first challenged Satoshi on the scaling (and centralization?) problem, has written about the Scalepocalypse. Yet I have those technological solutions he wishes for in an altcoin. (note side-chains are irreparably insecure). Details will be forthcoming on my Steemit (so follow me there!).





Edit: Will Millennials have to learn to do creative work, so they can work remotely and live in lower-cost jurisdictions, or just continue to swipe their lives away on a smartphone while sleeping on someone’s sofa? Is pulling income forward 30 years with debt not a massive bubble, compounded by a sovereign debt/welfare/socialism bond bubble that has finally come to its Minsky Moment?

https://www.quora.com/Do-millennials-feel-more-entitled-than-previous-generations/answer/Anthony-Negron-4 (Bingo!)
https://www.quora.com/Do-millennials-feel-more-entitled-than-previous-generations/answer/Anthony-Saldana-3 (There you go!)
https://www.quora.com/Do-millennials-feel-more-entitled-than-previous-generations/answer/Charles-Stone-6
https://www.quora.com/Do-millennials-feel-more-entitled-than-previous-generations/answer/Mateusz-Mroov
https://www.quora.com/Do-millennials-feel-more-entitled-than-previous-generations/answer/Nick-Chang-26
https://www.quora.com/Do-millennials-feel-more-entitled-than-previous-generations/answer/Faith-Paul-2
https://www.quora.com/Do-millennials-feel-more-entitled-than-previous-generations/answer/Karim-Elsheikh
https://www.quora.com/Do-millennials-feel-more-entitled-than-previous-generations/answer/William-Beteet-1
https://www.quora.com/Do-millennials-feel-more-entitled-than-previous-generations/answer/Gustin-Fox-Smith
https://www.quora.com/Do-millennials-feel-more-entitled-than-previous-generations/answer/Craig-Weiler
https://www.quora.com/Do-millennials-feel-more-entitled-than-previous-generations/answer/Christopher-Ordway
https://www.quora.com/Do-millennials-feel-more-entitled-than-previous-generations/answer/Michael-Brescia-2
https://www.quora.com/Do-millennials-feel-more-entitled-than-previous-generations/answer/Ross-Wilson-20
https://www.quora.com/Do-millennials-feel-more-entitled-than-previous-generations/answer/Torie-J-Patterson
https://www.quora.com/Do-millennials-feel-more-entitled-than-previous-generations/answer/Grant-Schmutte
https://www.quora.com/Do-millennials-feel-more-entitled-than-previous-generations/answer/Amy-Harris-56
https://www.quora.com/Do-millennials-feel-more-entitled-than-previous-generations/answer/Ben-Skirvin
https://www.quora.com/Do-millennials-feel-more-entitled-than-previous-generations/answer/Henriikka-Keskinen
https://www.quora.com/Do-millennials-feel-more-entitled-than-previous-generations/answer/Dave-Sloan
https://www.quora.com/Do-millennials-feel-more-entitled-than-previous-generations/answer/Anika-Pasilis (note the Millennials in UK with their leader Corbyn are trying to turn the UK towards Communism)
https://www.quora.com/Do-millennials-feel-more-entitled-than-previous-generations/answer/Bee-Rogers
https://www.quora.com/Do-millennials-feel-more-entitled-than-previous-generations/answer/Henry-Solomon-Crampton-Hays
https://www.quora.com/Do-millennials-feel-more-entitled-than-previous-generations/answer/Nikolite (interesting the perspective of a borderline Gen X/Millennial)
https://www.quora.com/Do-millennials-feel-more-entitled-than-previous-generations/answer/James-Edward-Hinds
https://www.quora.com/Do-millennials-feel-more-entitled-than-previous-generations/answers/47876582

The above comments are very interesting for getting inside the minds of Millennials. Whoa, so Henriikka Keskinen says (and you can see the environmentalism indoctrination in her writing, which Apple cleverly markets to) they are going to save the fucking planet with educated women, and we will get the same fucked result we get every time we do that, because…!

Mark apparently does not think that there are serious problems today that could impact a move away from iPhone's walled garden to Android's more open ecosystem. I claim that we need large screens and keyboards to be productive; he claims that only coders need that, and that smartphone users who spend money are not (and by implication do not need to become) creators. Afaics, he is essentially arguing there is no need to focus on the needs creators might have in mobility that differ from those of people who use computing only as consumers. The implication is that 700 million iPhone users can never significantly be creators, only consumers. So where is their income going to come from? I posit that perhaps new monetization models can enable people to earn money in more ways as creators, and that such monetization models (e.g. via blockchains) might be incompatible with a walled garden, i.e. I think Apple will stifle such innovation because they want to funnel all payments through their services so that Apple can take 30% fees. No way is the world going to give Apple 30% of everything. I believe the world will become a much more level playing field with fierce competition.

My point to Mark is that if everyone will only be consumers and not creators, then who will have the jobs to pay Apple $1 per day? If Apple is catering to a bankrupt world in which everyone just swipes and never creates, they will not succeed long-term. If the world has become more financially difficult, people are not going to be willing to give 30% fees on everything to Apple! Competition of the free market will route around that parasitic rent.






-- iPhone

I hope you did not drink the iPhone security and privacy FUD Koolaid. You give Apple closed-source control of your life and then claim you want anonymity and privacy. I presume you do not use the iPhone for anything you want to remain private from authorities. (A smartphone is a metadata leaking device (e.g. hotel IP address) even if you are using another device for sensitive activities.)

If you are not installing shady apps on Android, the anti-malware advantages of Apple’s (technological and) economic totalitarianism are probably not significantly better in practice, provided you are upgrading your OS every 6 months or so.

http://www.zdnet.com/article/the-worlds-most-secure-smartphones-and-why-theyre-all-androids/

Docking? Perhaps not in my case, as I can email myself PDFs (etc.) if I need to print something from one OS to the other.

Seems you may be missing the point? By docking, and having the ability of your smartphone to function as a full-fledged computer, you do not need to repurchase the laptop every time you upgrade your CPU. In other words, over time the software utilizes the faster CPUs, so normally we upgrade our CPU. Without docking, you would have to buy a new laptop and a new smartphone, instead of just a new smartphone.

Also all of your files and work are immediately (within a second) accessible simply by plugging in your smartphone to the docking laptop. No need to lose time transferring files around. This also presumes that apps on small screens will add features for working with documents (e.g. a quick annotation added) that you primarily edit on a large screen.

Also it means you can dock on a friend’s or public access docking station (e.g. a 50" screen at a conference room) without needing to lug around another laptop.

Wireless data access (especially WiFi) may become so fast, ubiquitous, and cheap that different devices can get the necessary documents off the cloud. But the duplicated CPU/GPU cost argument remains (and we probably want the most powerful CPU/GPU we can afford, especially for highly creative work such as animation). Also, the more we travel, the less sure we can be that in every circumstance we can get files off the cloud quickly and at low cost (gouging tourists is a known profit activity). And what about those rare instances when the cloud goes down, e.g. the 2016 Dyn DDoS attack that took a large chunk of the entire Internet down (it would suck if you have a conference presentation or other time-critical important work to do).

Even more important from my perspective, is not needing to maintain the file systems, apps, and OSes of two or more devices. That is half of the hassle. My life is already too complex. I do not want to maintain more devices than I need to. You must have extra time on your hands. I find maintaining multiple devices to be burdensome.

It is all about simplifying our lives and conserving scarce time.

Also, although perhaps you can afford to buy multiple copies of CPUs, the billions in the developing world probably cannot. I am talking about a world where there is much greater competition and incomes on average are much lower than they are now in the West (a virtual employee gets less than 50% of quoted rates, and I anticipate those not-highly-specialized rates will initially, for perhaps a decade, be driven down or kept stagnant by developing-world competition and Western economic implosion). Asia is coming. My Belgian friend and I were relating how the Philippines will see incomes double or triple in the next 10 years, but that this would still be an unacceptable quality of life by our Western expectations.

My cellphone is good enough when not in my hotel, and who CREATES anything while walking the streets of Italian cities?

You never had an idea you wanted to jot down while walking around?

I think that for my next trip I will leave the iPad at home, I have used it the least.

Tablets do seem to be dying.



Personally I never wanted a tablet (almost useless to me), except for reading books, but decided to buy a Kindle instead.

An ePaper screen (non-backlit, easier on the eyes for reading, but too slow for animated pointers and such on screen) which Bluetooth-docks to my smartphone would be perfect.

A tablet can be useful when we want a larger screen and only want to point and swipe. But the use cases seem too few to justify lugging around and maintaining another device. I do not know if Bluetooth (wireless) docking I/O would be fast enough for the docking laptop's screen to be detached and also run as a tablet.

Quote from: Traxo
(I don't like to install apps as well, if I don't have to - e.g reddit, I prefer browsers - multiple devices also means multiple installations)

Another motivation for an “app browser” is that users will not have to be concerned nor involved with installing and uninstalling apps. The sandbox will be secure and one will simply click an app link to run it and the caching of the code will be automatic.

Quote from: Traxo
Agree, did you check superscreen tho?

Ah, I see that is what I just wrote about above for a wireless display. It would be nice if it also optionally attached to a keyboard to make a laptop, or if it had a built-in flip-out stand, and if my smartphone could accommodate a Bluetooth keyboard and mouse. A tablet without an attachable keyboard probably still needs a screen protector (or carrying case) when traveling with it.



this is why i like my note 4 with its built in stylus. pull the stylus out it automatically opens a window to draw/write with. OCR can be done on it later if needed …

When I need to sketch (as opposed to structured shapes+text drawing), I prefer a pencil. I suppose there might be a few cases where I would be okay with a stylus on a slippery screen, but I really need the friction of paper and pencil to sketch. My sketching with a stylus slips all over the place and is fugly horrible. As for text, even though I had beautiful handwriting in elementary school, I can barely handwrite now, as it is an order of magnitude too slow compared to my typing speed. I loathe typing on mobile! It can become an enormous waste of time! I try to do as little as possible on mobile. This is why apps that require mostly only finger gestures for most actions are more popular on mobile. But I rarely use my Bluetooth keyboard, because the setup time/hassle is greater than the occasional terse note I want to type.

… and i can sync it with evernote on my pc. annotate it with the pen or whatever.

Syncing is for me yet another step that consumes my time. I have no free time. I have a TODO list that only grows longer the older I get.

most androids can also output HDMI and accept a regular pc mouse and keyboard via a otg cable.

Yeah, I remember now having researched that option years ago, but a problem at that time at least was that it did not seem to work plug-and-play on every device. Some fiddling and frustration and failure. Probably glitchy too (as even Bluetooth seems to be at times).

And even if it is now reliable, we have to lug around a full-size monitor, which is not compact, and then we have no battery option to use it unplugged. I like the laptop docking idea because it also charges the smartphone, meaning the smartphone CPU will run at full speed as if it were plugged into a wall socket.

allows a lot more work to be done with only a phone. of course still cant touch the horsepower of a pc and youre limited to android versions of software but its not too bad

Well, the fact that mobile apps do not adapt to work well on both large and small screens is one of the important aspects I want to address with an “app browser" concept for Bitnet. I want convergence between mobile, desktop, and browser code, so we developers can write once and run everywhere.

I think I read that mobile CPUs are roughly 1/8th the performance of a desktop CPU, fundamentally limited by TDP power dissipation (they cannot dissipate more than about 3 watts for an extended period, and perhaps 7 W in bursts, inside a plastic mobile phone). But the faster CPUs get, the fewer computing activities we do that have a noticeable delay, so the noticeable nominal differences are shrinking over time. Also, desktop CPUs can’t increase TDP on the same die, and multi-core can’t always be leveraged optimally by all software, thus the gap may close proportionally over time as well. However, we must bear in mind that the smartphone uses, for example, dedicated hardware video codec processors in order to attain TDP efficiency at such computationally expensive tasks. Thus a smartphone is underpowered as a general-purpose CPU for highly computationally intensive tasks such as video editing. In that case, we would need an additional processor in the docking station. But note that by leveraging the CPU and specialized processors in the smartphone, the docking station’s TDP can be much lower (more battery efficient) than a desktop’s. Thus the docking laptop idea seems to make more sense than lugging around an HDMI monitor that can only be used where there is an electrical plug, because portability and battery life are further factors that make it worth leveraging the low-TDP CPU and processors in the smartphone. Meaning I would like a laptop docking station with a larger screen, a keyboard, and a co-processor on board. Possibly in the future the co-processor could be on a server accessed over the wire.

https://www.quora.com/Are-smartphone-processors-finally-comparable-to-PC-processors-in-terms-of-performance/answer/Michael-Daniel-21

syncing between the phones and desktops onenote is automatic once setup.

But once again, this is another special-case device complexity between two filesystems and OSes that I have to figure out how to set up, and remember how to set up, if ever (i.e. a form of long-term maintenance after I have long since forgotten how I set it up).

I would rather have something that works the same with all Android phones and does not require me to maintain synchronicity between two disparate systems.

as for hdmi output i was thinking more along the lines of using the tv in a hotel room. my note 4 can wirelessly use some of them via a "mirror" option but im not sure what percentage of hotel tvs would have this feature. but there is always the hdmi cable as backup. so basically otg adapter with mouse/keyboard/hdmi cable would be the minimum needed. but outside of the hotel room yup youre stuck with the phones screen.. less than optimal.. also, some phone apps do not display correctly via hdmi.

More and more complexity, and Murphy’s law.



@miscreanity, I can talk faster than I can type, but last time I tried it on Android, the recognition engines couldn’t reliably keep up with my fast speech (I never tried Siri). If speech recognition (and the latency back to the server, or local computation) ever gets good enough, then possibly I can finally ditch the keyboard, except I think it will be exhausting to speak everything I type, especially “cursor up”, “cursor down”, “cursor to end of line”, etc. I agree that the monitor could plausibly be replaced by a headset (or perhaps holographic projection?), except we would still need a docking station for the pointing device and for presenting (unless all of the audience were also wearing headsets). But in any case, we still need apps that work well at many different display sizes and modes of use (i.e. terse gestures vs. detailed manipulation), which is one of my main points here w.r.t. my plans for an “app browser" concept for Bitnet.



I'm looking into this

http://iamcicada.com/cicada-deep-dive/

it looks super cool Smiley

I will explain why I think that is nonsense, at the appropriate time and in a venue where I am not banned.
OROBTC
Legendary
*
Offline Offline

Activity: 2912
Merit: 1852



View Profile
June 29, 2017, 09:33:48 AM
 #2946

This relates to the OP of this thread.

Is the future of mobile computing small screens or docking on large screens?

The original comment was made here, but as usual I seemed to get banned where ever I go.


That's a good question, and I can only offer observations not answers.

We are touring Italy, since I was not sure which devices would serve me best during our fairly long trip I brought them ALL along:

-- iPhone (with a temporary "plan" for overseas use)
-- iPad Mini4 (the littlest of the iPads, better resolution & screen than the phone, but mine is WIFI only)
-- my Wintel laptop (that I am writing on now)

No docking mechanism.  For traveling, it might be that we will be stuck with something with at least 12" diagonal screen measurement as I cannot efficiently type or even work the 'Net on my iPad Mini4.  The iPad is too big to carry in my pocket (like my cellphone).

I think that for my next trip I will leave the iPad at home, I have used it the least.  My cellphone is good enough when not in my hotel, and who CREATES anything while walking the streets of Italian cities?

So, my guess is cellphone for when walking around (not in hotel or meetings) and laptop for serious use. Docking? Perhaps not in my case, as I can email myself PDFs (etc.) if I need to print something from one OS to the other.
Traxo
Hero Member
*****
Offline Offline

Activity: 568
Merit: 703



View Profile
June 29, 2017, 10:30:11 AM
 #2947

This relates to the OP of this thread.

Is the future of mobile computing small screens or docking on large screens?

The original comment was made here.

>Mark wrote:
  Creative work on mobile devices is increasing rapidly.

This might be true, but I doubt the results are of the same or higher quality.
Just try to create music or draw/paint something on a small screen vs. a large screen.

>We can not do maximally productive work of nearly any kind of creative arts on a mobile device. The screen is too small, the keyboard is too slow, and the pointing device is too imprecise.

I agree with this.


Personally I never wanted a tablet (almost useless to me), except for reading books, but decided to buy a Kindle instead.
Perhaps something like Superscreen could come in handy.
If one wants a tablet for, let's say, gaming (or reading in my case; a mobile is too small for much text), what they really want is just a larger screen.

Docking seems practical, it might be the future.
vapourminer
Legendary
*
Offline Offline

Activity: 4326
Merit: 3519


what is this "brake pedal" you speak of?


View Profile
June 29, 2017, 01:28:29 PM
 #2948


>Mark wrote:
  Creative work on mobile devices is increasing rapidly.

This might be true, but I doubt the results are of the same or higher quality.
Just try to create music or draw/paint something on a small screen vs. a large screen.

>We can not do maximally productive work of nearly any kind of creative arts on a mobile device. The screen is too small, the keyboard is too slow, and the pointing device is too imprecise.

I agree with this.

this is why i like my note 4 with its built in stylus. pull the stylus out it automatically opens a window to draw/write with. OCR can be done on it later if needed and i can sync it with evernote on my pc. annotate it with the pen or whatever. most androids can also output HDMI and accept a regular pc mouse and keyboard via a otg cable.

allows a lot more work to be done with only a phone. of course still cant touch the horsepower of a pc and youre limited to android versions of software but its not too bad, to the point that i just take my note 4 with the otg adapter, a mouse and small keyboard on vacation, although with the note 4 stylus and largish screen i hardly use them.


miscreanity
Legendary
*
Offline Offline

Activity: 1316
Merit: 1005


View Profile
June 29, 2017, 02:47:08 PM
 #2949

This relates to the OP of this thread.

Is the future of mobile computing small screens or docking on large screens?

The original comment was made here, but as usual I seemed to get banned where ever I go.

For the immediate future, docking on large screens to do substantial work. I think non-technical creative production is likely to remain mobile and on varying size screens with a greater emphasis on physical controls and interfaces. Voice interfaces already work well enough for very basic purposes such as memos and short notes, simple instruction, etc.

Those are largely traditional arrangements. Wearable augmented displays have not yet been sufficiently miniaturized and commercialized, but they offer far greater flexibility than a docking station. Magic Leap is making progress, although there are some small outfits that are pursuing healthy compromise solutions for a moderate cost and seem to be closer to production models.

End result: some form of keyboard, perhaps foldable or rollable, combined with an AR headset and smartphone base. When laid out, the keyboard may act as physical feedback and a base for the AR system to render a desktop display of significant size while also providing creative content production ability.
vapourminer
Legendary
*
Offline Offline

Activity: 4326
Merit: 3519


what is this "brake pedal" you speak of?


View Profile
June 29, 2017, 04:47:13 PM
 #2950

this is why i like my note 4 with its built in stylus. pull the stylus out it automatically opens a window to draw/write with. OCR can be done on it later if needed …

When I need to sketch (as opposed to structured shapes+text drawing), I prefer a pencil. I suppose there might be a few cases where I would be okay with a stylus on a slippery screen, but I really need the friction of paper and pencil to sketch. My sketching with a stylus slips all over the place and is fugly horrible. As for text, even though I had beautiful handwriting in elementary school, I can barely handwrite now, as it is an order of magnitude too slow compared to my typing speed. I loathe typing on mobile! It can become an enormous waste of time! I try to do as little as possible on mobile. This is why apps that require mostly only finger gestures for most actions are more popular on mobile. But I rarely use my Bluetooth keyboard, because the setup time/hassle is greater than the occasional terse note I want to type.

… and i can sync it with evernote on my pc. annotate it with the pen or whatever.

Syncing is for me yet another step that consumes my time. I have no free time. I have a TODO list that only grows longer the older I get.

most androids can also output HDMI and accept a regular pc mouse and keyboard via a otg cable.

Yeah, I remember now having researched that option years ago, but a problem at that time at least was that it did not seem to work plug-and-play on every device. Some fiddling and frustration and failure. Probably glitchy too (as even Bluetooth seems to be at times).

And even if it is now reliable, we have to lug around a full-size monitor, which is not compact, and then we have no battery option to use it unplugged. I like that laptop docking idea because it also charges the smartphone, meaning the smartphone CPU will run at full speed as if it were plugged into a wall socket.

allows a lot more work to be done with only a phone. of course still cant touch the horsepower of a pc and youre limited to android versions of software but its not too bad

Well, the fact that mobile apps do not adapt well to both large and small screens is one of the important aspects I want to address with an “app browser" concept for Bitnet. Because I want convergence between mobile, desktop, and browser code, so we developers can write once and run everywhere.

i agree about the "feel" of a writing instrument; its hard to get right. the note 4 is not bad, its has a fair amount of friction.

syncing between the phones and desktops onenote is automatic once setup.

as for hdmi output i was thinking more along the lines of using the tv in a hotel room. my note 4 can wirelessly use some of them via a "mirror" option but im not sure what percentage of hotel tvs would have this feature. but there is always the hdmi cable as backup. so basically otg adapter with mouse/keyboard/hdmi cable would be the minimum needed. but outside of the hotel room yup youre stuck with the phones screen.. less than optimal.. also, some phone apps do not display correctly via hdmi.

IadixDev
Full Member
***
Offline Offline

Activity: 322
Merit: 151


They're tactical


View Profile WWW
June 30, 2017, 02:22:03 AM
 #2951

One of the main interests of mobile devices is low power consumption: no heat/fan/noise with an ARM CPU.

If it's to be used with a high-power non-mobile device, then better solutions can easily be found to have a bigger screen, more multitasking cores, etc.

But probably we'll see, sooner or later, more mobile devices that are essentially fully diskless terminal devices, with very limited processing power, used mostly as sophisticated programmable remote controls, with touch-button interfaces instead of mechanical ones.

The tablet paradigm also emerged from smartphones, and it's still this idea of a mobile device that connects to a wireless service, storing/accessing only the personal information related to those services (which already have their own data centers and server-side processes), rather than being really comparable to full desktop or even laptop computers with lots of processing power running complex standalone applications.

It's clear to me that a tablet without internet is quite useless in itself. Its usefulness is having social media, or news, or streaming music, without having to be connected to a power source, connected to data centers via wifi =)

In the absolute, the only thing a mobile device should store is the personal account information needed to access remote services, and it should not really be seen as a standalone computer.

the_end_is_near
Member
**
Offline Offline

Activity: 65
Merit: 10


View Profile
June 30, 2017, 10:19:18 AM
Last edit: July 01, 2017, 02:59:06 AM by the_end_is_near
 #2952

Also so I can remove the storage in an instant so that I can’t be forced to give up my encryption key. This reason conflicts with my statement that a smartphone is not a secure device, yet there may exist a more secure, locked-down variant of Android.

Meaning I would like a laptop docking station with a larger screen, keyboard, and a co-processor on board. Possibly in the future, the co-processor could be on a server accessed over the wire.

In the absolute, the only thing a mobile device should store is the personal account information needed to access remote services, and it should not really be seen as a standalone computer.

Agreed that a docking station, something like Sentio but with an extra CPU, is just an interim solution for what will eventually be all processing power located remotely on servers, with our computing device communicating wirelessly. If the CPU on our docking station never accepts apps (only accepts requests for computation, with no buffer overflows nor privilege escalation possible), then it can be designed to be secure; thus we do not have to expend much effort maintaining it, and it can be a publicly shared device.
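To illustrate the "computation requests only, never apps" design: the shared docking CPU would expose a fixed whitelist of pure operations and reject everything else, leaving no code-upload path to exploit. A rough sketch only; the operation names and the JSON request format here are invented for illustration.

```python
import hashlib
import json

# Whitelist of pure computations the shared docking CPU will perform.
# There is no way to upload code; unknown operations are rejected.
OPERATIONS = {
    "sha256": lambda data: hashlib.sha256(data.encode()).hexdigest(),
    "sum":    lambda nums: sum(nums),
}

def handle_request(raw: str) -> dict:
    """Parse one request; run it only if the op is whitelisted."""
    try:
        req = json.loads(raw)
        op = OPERATIONS[req["op"]]          # KeyError for unknown ops
        return {"ok": True, "result": op(req["arg"])}
    except (KeyError, TypeError, json.JSONDecodeError):
        return {"ok": False, "error": "rejected"}

print(handle_request('{"op": "sum", "arg": [1, 2, 3]}'))    # accepted
print(handle_request('{"op": "exec", "arg": "rm -rf /"}'))  # rejected
```

Because no request can install or run arbitrary code, a compromised client gains nothing beyond the whitelisted computations, which is what would make such a station safe to share publicly.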

Within the next 5 years*, our smartphone will be our secure device, but not in the way Apple is planning. Every smartphone will have essentially a Trezor on board (a minimal extra circuit adding maybe $10 in cost at high volumes and IC integration), and it will be a computer isolated from the rest of the smartphone with only a secure serial communication line. The secure hardware will be 100% open source (not this closed-source BS from Apple) so that we know the Five Eyes national security apparatus has no back doors (although we need some way to verify the chip fab is not introducing back doors). Whereas the generalized computer on the smartphone can never be a secure device because, for example, I read that smartphones are even vulnerable to hacking of the radio hardware on the device to turn on the microphone surreptitiously.
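The isolated-signer idea amounts to a very narrow command channel: the untrusted host sends only a digest across the "serial line", the element answers with a signature, and the key material never leaves the chip. A toy model follows; HMAC-SHA256 stands in for real ECDSA signing, and the `SIGN` command format is invented for illustration.

```python
import hashlib
import hmac

class SecureElement:
    """Toy model of the on-board signing chip: the secret never crosses
    the serial boundary; only digests go in and signatures come out."""

    def __init__(self, secret: bytes):
        self._secret = secret            # lives only inside the element

    def handle(self, request: bytes) -> bytes:
        # The only command accepted over the serial line: b"SIGN" + 32-byte digest.
        if not (request.startswith(b"SIGN") and len(request) == 36):
            return b"ERR"
        return hmac.new(self._secret, request[4:], hashlib.sha256).digest()

# Host side: the general-purpose (untrusted) CPU sees only digest and signature.
element = SecureElement(secret=b"\x01" * 32)
tx_digest = hashlib.sha256(b"pay 5 mBTC to alice").digest()
signature = element.handle(b"SIGN" + tx_digest)
```

The point of the narrow interface is that even a fully compromised host OS can only ask for signatures; it can never read the key out, which is exactly the property a Trezor-style chip provides.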

For privacy and security, our smartphone will be our identification and signing apparatus that we carry at all times (and eventually wearable and later embedded in the body 666 style). Our shortened passphrase or biometrics for signing will be rate limited and thresholded such that, if stolen, the device will brick itself after too many failed attempts, because as I wrote before about HumanIQ and Proof-of-Person, biometrics can be fooled with synthetics. Your paper wallet will retain your master keys, which you can use to recover your accounts (and invalidate the keys on the stolen smartphone) if your smartphone is lost or stolen.
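A minimal sketch of the brick-after-N-failures behaviour described above. The threshold and names are illustrative; a real device would wipe its key material inside tamper-resistant hardware rather than a Python attribute.

```python
class RateLimitedWallet:
    """Toy signing device: too many wrong passphrases and the keys are
    destroyed, recoverable only from the paper-wallet master keys."""

    MAX_ATTEMPTS = 5                     # illustrative threshold

    def __init__(self, passphrase: str, key_material: bytes):
        self._passphrase = passphrase
        self._key = key_material
        self._failures = 0

    @property
    def bricked(self) -> bool:
        return self._key is None

    def unlock(self, attempt: str) -> bytes:
        if self.bricked:
            raise RuntimeError("bricked: restore from paper-wallet master keys")
        if attempt == self._passphrase:
            self._failures = 0
            return self._key
        self._failures += 1
        if self._failures >= self.MAX_ATTEMPTS:
            self._key = None             # wipe the keys irreversibly
        raise ValueError("wrong passphrase")
```

A thief gets at most `MAX_ATTEMPTS` guesses before the device destroys its keys; the owner then provisions a fresh device from the paper wallet and invalidates the stolen one.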

And we need everyone, even the fish vendors in developing countries, to have a smartphone, because this will be much more efficient than cash (vendors often run out of change here in the Philippines, for example). So $600 iPhones are not going to fit. I know Apple’s overconfident/arrogant attitude is that they do not care because they are generating huge revenues from the affluent class, but w.r.t. currency I agree there will only be one; thus the masses will dictate the winning currency (note I am not referring to the unit-of-reserve, which will likely remain for example the US dollar for fiat and Bitcoin for cryptocurrency for the time being). All that do-gooder environmentalism and security FUD marketing from Apple, yet the truth is they do not give a fuck about humanity and only care about extracting maximum profits.

Apple (with the highest revenue of any company in the world, but not the largest market cap, probably because of the following point) will refuse to provide this in a smartphone because it conflicts with their business model for Apple Pay. Apple’s walled-garden services are their fastest growing revenue source, and their market cap is highly dependent on them being able to monopolize the extraction of rents from their ecosystem. In short, Apple must own you the user, else their market cap will collapse. That is, unless Apple could entirely reinvent themselves, but changing the culture of the employees is virtually impossible. There is absolutely no way that Apple Pay will win against a permissionless blockchain payment system which nobody owns or extracts fees from. The only remaining problem has been solving the scaling and winner-take-all centralization problems of blockchains, but I have already solved that.


Very interesting to me is that our software for small screens (or more saliently, less detailed gesture control, given the large screens possible with wearable AR glasses) must integrate/adapt better with our software for large screens (or more saliently, fine-grained gesture control over interaction with the system). For the interim this can mean that the generalized computer on our smartphone is enabled with something like Sentio, so that the app (must be programmed to) adapts immediately to any screen size.

This is why apps that require mostly only finger gestures for most actions are more popular on mobile. But I rarely use my Bluetooth keyboard, because the setup time/hassle is greater than the occasional terse note I want to type.


@miscreanity, I can talk faster than I can type, but last time I tried it on Android, the recognition engines couldn’t reliably keep up with my fast speech (never tried Siri). If ever speech recognition (and latency back to the server, or local computation) gets good enough, then possibly I can finally ditch the keyboard, except I think it will be exhausting to speak everything I type, especially “cursor up”, “cursor down”, “cursor to end of line”, etc. I agree that the monitor could plausibly be replaced by a headset (or perhaps holographic projection?), except we still need a docking station for the pointing device and when presenting (unless all of the audience were also wearing headsets). But in any case, we still need apps that work well at many different display sizes and modes of use (i.e. terse gestures vs. detailed manipulation), which is one of my main points here w.r.t. my plans for an “app browser" concept for Bitnet.

The viruses being introduced by apps on Android can be solved with my idea for an “app browser" that creates a stricter sandbox. Most apps do not need the access to low-level APIs and buffer-overflow injection holes (that can enable privilege escalation attacks in order to, for example, gain root access) that they get with the full generality of the Android OS. Also, the “app browser" can be upgraded immediately on any Android phone, solving the problem of the latest version of the Android OS not diffusing through the ecosystem of hardware. This applies the concept of separation-of-concerns; whereas Apple attacks problems by conflating concerns into a top-down controlled jail, which creates more problems. There is no need for Apple’s walled garden. Curation of app quality can be done by user curation, with users signing with their cryptographic reputation to eliminate the user-spoofing problem of user-driven curation. Each of us can choose which moderators we follow w.r.t. curation. All decentralized. No 1984-esque Apple Big Brother owning our lives.
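The user-driven curation idea could look something like this: the "app browser" installs an app only if its hash carries an endorsement from a moderator the user has chosen to follow. A hedged sketch; keyed HMAC stands in for the public-key reputation signatures a real decentralized system would use, and all names are illustrative.

```python
import hashlib
import hmac

def endorse(moderator_key: bytes, apk_bytes: bytes) -> bytes:
    """A moderator signs the hash of an app they vouch for."""
    apk_hash = hashlib.sha256(apk_bytes).digest()
    return hmac.new(moderator_key, apk_hash, hashlib.sha256).digest()

def is_trusted(apk_bytes: bytes, endorsements: list, followed_keys: list) -> bool:
    """True if any endorsement verifies against any moderator the user follows."""
    apk_hash = hashlib.sha256(apk_bytes).digest()
    return any(
        hmac.compare_digest(sig, hmac.new(key, apk_hash, hashlib.sha256).digest())
        for sig in endorsements
        for key in followed_keys
    )

apk = b"...apk bytes..."
mod_key = b"moderator-secret"
endorsements = [endorse(mod_key, apk)]
assert is_trusted(apk, endorsements, [mod_key])              # vouched-for app installs
assert not is_trusted(b"tampered", endorsements, [mod_key])  # altered app rejected
```

Since endorsements are bound to the app's hash, a spoofed or tampered binary fails verification, and each user decides which moderator keys to follow, with no central gatekeeper.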


* Someone raised the point of it being dubious whether people will be ready to trust their smartphones within 5 years. My thought was that within 5 years the necessary blockchain will be in place and gaining usership, and that smartphone will become available. Not that we will achieve even 50% market adoption within 5 years. My point is that the writing will be on the wall in terms of a fledgling new market within 5 years that is growing much faster than anything else and subsuming everything. Reviewing the comments made by Millennials in my prior post, it is clear to me that Millennials are hungry for change, embrace technology fully, are not loyal, and are pragmatic. Their main failing is they’ve been indoctrinated by boomers (about BS lies such as environmentalism and the value of non-STEM education) and thus they are unrealistic about economics. But some minority, or perhaps up to half of them, will turn very quickly as they start to wake up. They will get on the cryptocurrency train because they have no other chance for a future. Apple is corralling them into a consumer-only swiping jail on their mother’s sofa while China races away, totally obliterating Apple and laughing their heads off at the nonsense ideology of the foolish Westerners. The Chinese are astute about economics and Apple is too. But Apple depends on their users being dumb (i.e. indoctrinated with lies, marketing spin, and FUD).

Someone also asked whether, if people do not have money to pay for an iPhone, then they will not have money to pay for TV (in his country the citizens pay for TV, and I suppose everyone does in any country if they have cable TV). I responded as follows. People will stop paying for TV and get their TV over the Internet. It is all about not wasting resources on things that are not a good return-on-investment. The clever among the Chinese think of every expenditure as a business investment. See how iFlix is undercutting Netflix in Asia ($3 monthly instead of $7 in the Philippines) and, in addition to Western content, also offering Asian content not available on Netflix. And they will stop paying through the nose to Apple, when they can get everything for 1/3 the price on Android and more freedom too. Bitnet will not be available on iPhone. If the global currency everybody is using is not available on iPhone, most will stop using iPhone. Apple’s business model is that the whole ecosystem has to give Apple a percentage, and thus the iPhone is a jail tightly controlled by Apple. Whereas, since Google’s profits come primarily from advertising sales, Android’s business model is open and any .apk can be installed even if it is not on the Google Play Store.



Related, but fits into the whole privacy/security sphere would be the Operating System "Tails", a way to boot up your computer from a flash drive that would then leave zero tracks on your computer.  A link to the below comes right from the torproject.org page, so I presume this is legit.

Tor and Tails have significant flaws. I do not recommend them. Did you not read my blog (not yet finished) on anonymity?

My friend, I must say that nearly everything you post about privacy and security is wrong and unsafe. For example, you keep wanting to use off-chain mixers even though I have told you many times that (they are either jammable or) you must trust the mixer site to not be compromised. And then you use an iPhone too. Which shows that you are not knowledgeable of the technological issues and you just trust whatever you read. Which is why you are making so many mistakes.

I saw this at a recent thread (https://bitcointalk.org/index.php?topic=1935098.0;all), it is a new way to mix coins:

Clearnet link: https://chipmixer.com/
Tor link: http://chipmixerwzxtzbw.onion/

I have not tried it, but I hope to sometime soon.

Also re encryption: for some time blockchain.info has NOT offered their mixing service (a version of CoinJoin, I think). I think I read that someone found a bad flaw. Pity!

we need stuff like that ASAP.

I am happy at least one reader understands. More will understand once we get in action and they have tried it. And we have perfected blockchains and payments with them.

P.S. I am having a severe allergy problem from my toxic liver, in that my immune system is attacking my only non-blinded eye. I am fearful about going blind. I stopped the liver-toxic TB antibiotics (had 9 days remaining) last night. I can not keep my eyes open, they hurt so much when light hits them. I think I may try to go to an eye doctor to get the pressure in the eye checked. Maybe I can get glaucoma drops to lower the pressure if that is one of the symptoms. Very scary moment for me.

Other than the eyes, I feel fine and want to work.

As you can probably tell from my writing I am exasperated that I can not work for past several years! I prefer to let my code speak.

UPDATE:

Subject: eye doctor found a serious problem

I was at the eye doctor (who happens to be a specialist in cornea surgery) and she said I have a 6mm x 1mm abrasion on my cornea in my only non-blinded eye. She scraped it to put on slides for the laboratory, to check for bacteria, fungus, and TB. Will get results at 4pm. It is 11am now. It could be an infection or it could be an autoimmune reaction. We will get a better indication once we get results from the lab. She prescribed Tobramycin drops in the interim, every hour (a gram-negative antibiotic). She said my eye is very inflamed and she is very concerned. Anyway, if it is TB then we will know I have MDR-TB. I suspect Michelle brought me this infection when she returned here this past week. She has still been coughing. I told her she is not allowed to go anywhere any more bringing me infections, else she can not stay with me any more. I am so tired of Filipinos bringing me infections. I restarted my probiotics just now (with butter) to try to strengthen my natural immune system. I could definitely feel the immune reaction connected to my gut, so I am suspecting maybe it is just an autoimmune reaction. Will update you later. This is very worrisome.

I also did another set of liver enzymes and a CBC blood test (so I can see how high my lymphocytes are). Will get the results later today.
vapourminer
Legendary
*
Offline Offline

Activity: 4326
Merit: 3519


what is this "brake pedal" you speak of?


View Profile
June 30, 2017, 03:50:58 PM
Last edit: June 30, 2017, 05:09:43 PM by vapourminer
 #2953

with regards to screen size and apps auto adapting i can only comment on my note 4 that i occasionally use hotel room smart tvs for display. its still hit or miss; seems the built in apps are the worst, maybe because they expect to only be run on the phones screen. 3rd party stuff is better. for instance web pages in chrome scale well and run on tvs better than on the phones screen. movies via plex and vlc are of course much better on a hotel tv than the phone.

having trezor like security (i own a trezor) is going to be essential as you point out. the only way i would trust a "universal" docking station (for want of a better term) is the way you describe it, ie a single secure trusted channel that the phone controls completely. plug it in anywhere to get access to a more powerful cpu/gpu/screen and communications capability. i could see stations like this cropping up all over the place. libraries and cafes and such, linked to a crypto currency account and charging for resources used.

we need stuff like that ASAP.



OROBTC
Legendary
*
Offline Offline

Activity: 2912
Merit: 1852



View Profile
June 30, 2017, 08:28:47 PM
 #2954

...

Related, but fits into the whole privacy/security sphere would be the Operating System "Tails", a way to boot up your computer from a flash drive that would then leave zero tracks on your computer.  A link to the below comes right from the torproject.org page, so I presume this is legit.

The main problem seems to be that it requires a LONG time to download and configure, may be complicated...

https://tails.boum.org/

I am sure many of us would really like to hear from anyone who has used Tails.
Last of the V8s
Legendary
*
Offline Offline

Activity: 1652
Merit: 4392


Be a bank


View Profile
June 30, 2017, 08:57:47 PM
 #2955

^tails is pure security theatre. all modern *nix is compromised, same as windows.
... smartphone ... (although we need some way to verify the chip fab is not introducing back doors).
gsm has backdoor
no modern chip specsheets/designs have leaked. ever. cannot verify

CoinCube (OP)
Legendary
*
Offline Offline

Activity: 1946
Merit: 1055



View Profile
June 30, 2017, 10:31:40 PM
 #2956

An interesting article over at ZeroHedge

Bob Rodriguez: "We Are Witnessing The Development Of A Perfect Storm"
http://www.zerohedge.com/news/2017-06-30/bob-rodriguez-we-are-witnessing-development-perfect-storm


IadixDev
Full Member
***
Offline Offline

Activity: 322
Merit: 151


They're tactical


View Profile WWW
June 30, 2017, 11:32:14 PM
Last edit: July 01, 2017, 12:25:05 PM by IadixDev
 #2957

One day I will explain to you the trick to turn your contactless credit card into a cell phone! (You need a multi-core credit card to have video chat though :p)

I studied zoology, I know about species theory and all this; it's clear the cell phone and the credit card are the same species, or they have a common ancestor, even if it's never very clear how this stuff really works. Cell phones are self-powered credit cards, in that they can do more processing without being connected to an external power source.

They just need to find a way to have very low consumption for mobile devices, e.g. solar powered, and limit their processing to very simple operations to communicate with wireless distributed systems, with zero boot time + no cache so they are only powered when they are used, or with ACPI power-up. And if possible without 3GB of updates every day.

THX 1138
Full Member
***
Offline Offline

Activity: 208
Merit: 103



View Profile
July 01, 2017, 01:14:17 PM
 #2958


Agreed that a docking station something like Sentio but with an extra CPU is just an interim solution for what will eventually be all processing power located remotely on servers with our computing device communicating wirelessly. If the CPU on our docking station never accepts apps (only accepts requests for computation with no buffer overflows nor privileged escalation possible) then it can be designed to be secure and thus we do not have to expend much effort maintaining it and it can be a publicly shared device.

Within the next 5 years*, our smartphone will be our secure device, but not in the way Apple is planning. Every smartphone will have essentially a Trezor on board (a minimal extra circuit adding maybe $10 in cost at high volumes and IC integration) and it will be an isolated computer from the rest of the smartphone with only a secure serial communication line. The secure hardware will be 100% open source (not this closed source BS from Apple) so that we know the Five Eyes national security apparatus have no back doors (although we need some way to verify the chip fab is not introducing back doors)...

...having trezor like security (i own a trezor) is going to be essential as you point out. the only way i would trust a "universal" docking station (for want of a better term) is the way you describe it, ie a single secure trusted channel that the phone controls completely. plug it in anywhere to get access to a more powerful cpu/gpu/screen and communications capability. i could see stations like this cropping up all over the place. libraries and cafes and such, linked to a crypto currency account and charging for resources used.

we need stuff like that ASAP.

The Sentio looks interesting; certainly something I'd consider, though I too would need a powerful CPU as I presently do a lot of laptop audio and video editing, including when travelling.

I remember coming across the Phoneblocks modular smartphone project a few years ago and was disappointed it didn't take off. I'm wondering if something similar to this could be a possibility w.r.t. a detachable Trezor-like security component?

BTW, good luck with the test results.
expertpc
Newbie
*
Offline Offline

Activity: 14
Merit: 0


View Profile
August 15, 2017, 08:12:29 PM
 #2959

The decision on expanding or contracting the money supply could be decentralized, i.e. a pool could decide whether its mined blocks get a reward or not, and miners could control the money supply in a decentralized way.
goin2mars.
Full Member
***
Offline Offline

Activity: 121
Merit: 100


View Profile
August 16, 2017, 02:57:28 AM
 #2960

The decision on expanding or contracting the money supply could be decentralized, i.e. a pool could decide whether its mined blocks get a reward or not, and miners could control the money supply in a decentralized way.

The overhead costs inherent in mining mean that anyone who incurs them is incentivised to elect a high block reward for themselves at all times.

This would give them an unfair advantage over those who do not mine, who incur no costs by simply holding the token, and who do not want their money supply debased. The choice is therefore not properly decentralized, since the power to debase the money supply lies with a concentrated group: for example, approximately 48.7% of Bitcoin's mining power is controlled by only two pools.

Additionally, what you are describing is already technically possible. A hard fork, for example, produces two coexisting forks. And if the miners collude with the exchanges, there is potential for even further centralization of choice: note that not every exchange currently supports the recent Bitcoin fork, and users have not been credited with that token.

me before: goo dot gl/QV7mhF
C0A2A1C4
ham